Thursday, August 20, 2009

TH-F6A Radio Recorder using Mini-ITX

There are a few amateur satellites up in the sky that I'd like to listen to. The problem is that I'm usually neither awake nor home when they pass over. I could cart my radio and antenna around during the day, but that's a pain and I'd get a lot of strange looks in the parking lot. I would probably also need a good excuse for being away from my desk for most of the day :-) Unfortunately, I can't be everywhere at once... but I can surely program a machine to be there for me! Tonight, I hacked together a quick solution to the problem using the following:
  • an old mini-itx machine (EPIA1000)
  • my TH-F6A portable radio
  • the infamous arrow antenna
  • of course, the Python Programming Language
You can see the basic layout in the image below. I needed a way to get the wav files off remotely, which is why you see the Cisco NIC sticking out of the front. If you look closely, you can see a ton of scratches on the box. Believe it or not, I can't see any of these in person; the machine only looks like it's been through a war in the photo!

Now, I needed a quick and effective way to record audio and store it off; pyaudio to the rescue! The code is not optimal in the sense that it waits until the end of the duration to write the stream out to disk. I'm not very concerned about this, as the LEO satellite passes are typically around 12 minutes and there's nothing else on the hard drive.

import pyaudio
import wave

def record(filename, durationMinutes):
    print 'recording for %s minutes' % (durationMinutes)

    Format = pyaudio.paInt16
    Chunk = 1024
    Channels = 1
    Rate = 44100
    Seconds = 60 * durationMinutes

    p = pyaudio.PyAudio()
    stream = p.open(format = Format,
                    channels = Channels,
                    rate = Rate,
                    input = True,
                    frames_per_buffer = Chunk)

    # read the whole pass into memory, then write it out at the end
    all = []
    for i in range(0, Rate / Chunk * Seconds):
        data = stream.read(Chunk)
        all.append(data)

    data = ''.join(all)
    wf = wave.open(filename, 'wb')
    wf.setnchannels(Channels)
    wf.setsampwidth(p.get_sample_size(Format))
    wf.setframerate(Rate)
    wf.writeframes(data)
    wf.close()

    stream.close()
    p.terminate()
I put in some smarts using datetime and threading to kick off a recording session at a specific time, then wrote a quick snippet to test it. Above, you can see that my radio was tuned in to NPR (I'm not a huge fan of NPR, but the music is good now and then).

import datetime
import threading

def scheduleRecording(outputFilename, datetimeStart, durationMins):
    print 'scheduled recording to file: ', outputFilename
    print 'at: ', str(datetimeStart)
    print 'for %s minutes.' % (durationMins)

    now = datetime.datetime.now()
    waitSeconds = datetimeStart - now

    t = threading.Timer(waitSeconds.seconds, record,
                        kwargs = {'filename': outputFilename,
                                  'durationMinutes': durationMins})
    t.start()

if __name__ == '__main__':
    filename = 'test.wav'
    passTime = datetime.datetime(2009, 8, 20, 22, 30, 0)
    scheduleRecording(filename, passTime, 1)
The code is really rough, but right now I just want it to work. It's Python after all; I can easily add a list of different file names, datetimes, etc. Now that this is ready, I need to find a sat pass to record. Ah, but wait! You're probably interested in seeing the final rig. Check it out below in all of its glory.
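The "list of file names and datetimes" idea really is a short loop. A minimal sketch of what I have in mind (the pass times and file names below are made up, and schedule_func stands in for a scheduleRecording like the one above):

```python
import datetime

# made-up pass list: (output file, start time, duration in minutes)
passes = [
    ('pass1.wav', datetime.datetime(2009, 8, 21, 6, 15, 0), 12),
    ('pass2.wav', datetime.datetime(2009, 8, 21, 19, 40, 0), 10),
]

def schedule_all(pass_list, schedule_func):
    # schedule_func stands in for the scheduleRecording defined above
    for filename, start, minutes in pass_list:
        schedule_func(filename, start, minutes)
```

Each entry spins up its own timer, so a whole day's worth of passes can be queued in one shot.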

Now, everything seems fine and dandy, but that couldn't be further from the truth. Here are a few problems:

Problem 1 - my HT is not programmable in real-time. This means that unless I physically hold the radio and press buttons, I have no way of changing the frequency/tone/mode. No adjusting for Doppler shift during the pass for this rig!!
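For a sense of scale, the worst-case Doppler shift on 2M works out to a few kHz. A back-of-the-envelope sketch (the 7.5 km/s LEO radial velocity and 145.8 MHz downlink are illustrative assumptions, not measurements):

```python
def doppler_shift_hz(freq_hz, radial_velocity_ms):
    # classical approximation: shift = f * v / c
    c = 299792458.0
    return freq_hz * radial_velocity_ms / c

# ~145.8 MHz downlink, ~7.5 km/s worst-case radial velocity
# gives a shift on the order of 3.6 kHz
shift = doppler_shift_hz(145.8e6, 7500.0)
```

A few kHz is enough to matter on FM, so a fixed-frequency rig like this one will lose some of the start and end of each pass.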

Problem 2 - I do not have any sat tracking software installed, nor do I own a rotator. As a result, my arrow will be sitting outside at an angle to catch what it can. Now, I do have a very nice camera tripod, but with all the rain we've been getting lately, I have no way of protecting it.

This is a quick and dirty test bed that I'm going to be experimenting with for the next few days at least. A neat idea would be to leave it hooked up, record the LMARS net every evening, and make the captures available in MP3 format! I may just do that... But for now, it's off to find a satellite pass!
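If I go the MP3 route, the conversion step could be as simple as shelling out to an encoder. A hedged sketch, assuming the lame binary is installed and on the PATH (file names are made up):

```python
import subprocess

def lame_command(wav_path, mp3_path, quality='-V2'):
    # build the encoder invocation; -V2 is a reasonable VBR quality setting
    return ['lame', quality, wav_path, mp3_path]

def wav_to_mp3(wav_path, mp3_path):
    # lame must be installed for this call to succeed
    subprocess.check_call(lame_command(wav_path, mp3_path))
```

Dropping that at the end of the recording thread would leave a ready-to-serve MP3 next to each wav.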

Sunday, August 9, 2009

2M JPole Testing

I hit the outside today to test out a 2M JPole Ed Evenson handed off to me at Field Day. It's a really nice unit he put together (and a very kind gift). I only had a few minutes just now but got it all strung up for a quick test. I'm going to see if I can catch anyone around this evening. I was able to trip the repeater at 5W and 1/2W, and I only had this thing strung up maybe 9 ft at most. One thing I plan on grabbing today is a nice ladder!

I picked up the SMC/BNC adapter and SMC bridge over at Radio Shack. I was actually pleasantly surprised at the number of adapters they had available there in store. You can see the attachment piece and bridge connectors below; not bad!

If all goes well, I should be back on the net this week.


Wednesday, August 5, 2009

Loading up the FCC License Database

I am putting up a site that requires access to a database containing the latest and greatest list of amateur radio licensees. This data is freely available directly from the FCC's website. They provide the data in a very clean format, consisting of a schema (txt file) and several data files (dat files). They also distribute PDFs that describe the data format in great detail. I hear people on TV and radio constantly taking stabs at the FCC for this or that. Well, I'd like to take a quick second and extend my thanks to the FCC for providing this data in a simple format that is easy to obtain and very well documented. Hats off to you, FCC!

Now, the question is: what is the fastest way to get this data into a relational format, ready to be used by a website? The data is only 355 MB raw, not much to deal with. There are definitely several solutions to this problem in whatever language you choose. This blog posting will describe my solution, using SQLite and Python.

Importing the Schema

The schema txt file comes in a format that cannot be used directly. Depending on your DBMS, you'll need to hack at the formatting a bit. There are 'go' delimiters between each create table statement, and none of the creates are semicolon delimited. I wanted to perform the database creation in a single 'executescript' call using SQLite, so I had to fix up the text. I wrote a few lines of Python to fix up the data and import it into a new SQLite database.
import sqlite3

data = open(sqlFilename, 'r').read()
# hacks: remove go tokens and close off create statements
# so this can be executed as a script
data = data.replace('go\n', '')
data = data.replace(')\n', ');\n')

con = sqlite3.connect(dbFilename)
con.execute("attach '%s' AS dbo;" % (dbFilename))
con.executescript(data)
con.commit()
dbFilename can be something like "test.db" and sqlFilename should be something like "pa_ddef44.txt". After executing the snippet, you'll end up with a complete SQLite database that's as empty as can be. Now we need to get all our data imported. This is where I hit a stumbling block.

Importing the Data Files

I wanted a pure-Python implementation of the import, directly from CSV (without resorting to the sqlite3 executable). All the data files come in a "CSV"-like format, where | is the delimiter. The problem is that DB-API 2.0 does not promote an "import from CSV" method (and that's a good thing). If I had to do this manually by creating the insert statements, it would have been necessary to deduce which fields in each table are varchar, then quote the varchar fields. I definitely did not want to deal with all that jazz, so I went ahead and just used the command line utility. Thankfully, sqlite3 has the .import command. I ended up using Python anyway to script it, since there was some necessary mangling of the table name:
import glob
import os

files = glob.glob('*.dat')
for file in files:
    tableNameSuffix = os.path.splitext(os.path.basename(file))[0].upper()
    tableName = 'PUBACC_' + tableNameSuffix

    cmd = 'sqlite3 -separator "|" %s ".import %s %s"' % (dbFilename, file, tableName)
    print 'ex: ', cmd
    os.system(cmd)
    print 'size: ', os.path.getsize(dbFilename)
The tableNameSuffix statement probably looks like a huge hack, but the last portion of each table name comes from the file names. Rather nifty! So, what does this all look like at the end of the day? I went ahead and ran it; here are the results:
created database...
size: 193536
ex: sqlite3 -separator "|" test.db ".import AM.dat PUBACC_AM"
size: 193536
ex: sqlite3 -separator "|" test.db ".import CO.dat PUBACC_CO"
size: 43587584
ex: sqlite3 -separator "|" test.db ".import EN.dat PUBACC_EN"
size: 48450560
ex: sqlite3 -separator "|" test.db ".import HD.dat PUBACC_HD"
size: 190622720
ex: sqlite3 -separator "|" test.db ".import HS.dat PUBACC_HS"
size: 339019776
ex: sqlite3 -separator "|" test.db ".import LA.dat PUBACC_LA"
size: 414668800
ex: sqlite3 -separator "|" test.db ".import SC.dat PUBACC_SC"
size: 414686208
ex: sqlite3 -separator "|" test.db ".import SF.dat PUBACC_SF"
size: 416502784
Voila! Done. So, now what can we do with all this? First off, where am I in all this?
sqlite> select * from PUBACC_AM where callsign = 'KJ4JIO';

That's me!! And the T stands for Technician. If we dig into another table, we can find out some more:

sqlite> select first_name, last_name, city from PUBACC_EN where call_sign='KJ4JIO';

Which makes me wonder, how many other Armbrusters out there have their amateur radio licenses? Let's find out!

sqlite> select first_name, last_name, city from PUBACC_EN where last_name = 'Armbruster';

Catherine,Armbruster,Palm City
Michael,Armbruster,Palm City

Quite a list indeed. Let's get some more useful information; how about when each of these individuals was granted their license?

sqlite> select first_name, last_name, city, grant_date from PUBACC_EN inner join PUBACC_HD on PUBACC_EN.unique_system_identifier = PUBACC_HD.unique_system_identifier where last_name = 'Armbruster';

Catherine,Armbruster,Palm City,03/27/2001
Michael,Armbruster,Palm City,03/27/2001

Problem solved. I can now dig through all the amateur radio licensees in the US and generate some potentially useful results. It would be a fun task to integrate this with Google Earth or Maps to get a view of all amateur radio operators in the US. In fact, that would make an excellent topic for a future posting.
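For the eventual website, the same join runs straight through Python's sqlite3 module. A minimal sketch against a throwaway in-memory database (the rows here are made up, not real FCC licensee data, and only the columns used in the join are modeled):

```python
import sqlite3

con = sqlite3.connect(':memory:')
# minimal stand-ins for the two FCC tables used in the join;
# the rows are made up, not real licensee data
con.execute('create table PUBACC_EN (unique_system_identifier int, '
            'first_name text, last_name text, city text)')
con.execute('create table PUBACC_HD (unique_system_identifier int, '
            'grant_date text)')
con.execute("insert into PUBACC_EN values (1, 'Jane', 'Doe', 'Palm City')")
con.execute("insert into PUBACC_HD values (1, '03/27/2001')")

# the same inner join as at the sqlite> prompt, driven from Python
rows = con.execute(
    'select first_name, last_name, city, grant_date '
    'from PUBACC_EN inner join PUBACC_HD on '
    'PUBACC_EN.unique_system_identifier = PUBACC_HD.unique_system_identifier '
    "where last_name = 'Doe'").fetchall()
```

Point the connect() call at the real test.db instead of ':memory:' and the rest carries over unchanged.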


Tuesday, August 4, 2009

ArcObjects Moment of the Day

For those of you that deal with ArcMap often, I figured I would pass this on. If you do any raster processing in Arc and end up getting random errors when processing the raster, pay close attention to this page:

I quote:
Some characters are not allowed in the name of an output raster. The special characters that are explicitly not allowed are:

( (open parenthesis)
) (close parenthesis)
{ (open brace)
} (close brace)
[ (open bracket)
] (close bracket)
\ (backslash)
~ (tilde)
' (single quote)
" (double quote)
, (comma)
' ' (space)

Pay close attention to that "space"... So, next time you get random errors and have a space in your filename, try putting your raster in C:\Temp or something to see if that fixes it...
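If you'd rather catch this before Arc complains, a quick pre-flight check is easy to write. A sketch (the function name and path handling are my own, not part of ArcObjects; ntpath is used so Windows-style paths split correctly anywhere):

```python
import ntpath  # Windows-style path handling, since ArcMap lives on Windows

# the characters listed above, plus space
INVALID_CHARS = set('(){}[]\\~\'", ')

def raster_name_ok(path):
    # only the raster's own name matters, not the directories above it
    name = ntpath.basename(path)
    return not any(c in INVALID_CHARS for c in name)
```

Run it over your output names before kicking off a long raster job and save yourself a cryptic error.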