Tuesday, October 6, 2009

Where is my satellite - in Python

Mark (K6HX) had suggested that using PyEphem would be easy... and, well, it is! I worked off his example here to create my own utility that polls for upcoming passes as time ticks on. It's nothing fancy by any means, but it got me digging into the code a bit. Here are the results:

# Script to sit and poll for upcoming satellite passes.
# Joseph Armbruster

# TODO(joe) : clean up these imports
import datetime
import ephem
import os
import sys
import time
import urllib2

# TODO(joe) : add getopt configuration options for this to main.
_latlong = ('28.5340', '-81.2594')  # user lat/long (strings, so ephem reads them as degrees)
_notify = 30        # let us know this many minutes in advance of a pass
_usevoice = True    # use voice?
_statussleep = 1    # how many minutes to sleep between status updates

def GetTLEs():
    '''GetTLEs(): returns a list of tuples of keps for each satellite.
    This function currently relies on a url from amsat.org.'''
    # grab the latest keps
    tles = urllib2.urlopen('http://www.amsat.org/amsat/ftp/keps/current/nasabare.txt').readlines()

    # strip off the header tokens and newlines
    tles = [item.strip() for item in tles]

    # group the lines into (name, line1, line2) tuples
    tles = [(tles[i], tles[i+1], tles[i+2]) for i in xrange(0, len(tles)-2, 3)]

    return tles

if __name__ == '__main__':

    observer = ephem.Observer()
    observer.lat = _latlong[0]
    observer.long = _latlong[1]

    tles = GetTLEs()

    while 1:
        now = datetime.datetime.now()

        # iterate through all the two line element sets
        for tle in tles:

            sat = ephem.readtle(tle[0], tle[1], tle[2])

            rt, ra, tt, ta, st, sa = observer.next_pass(sat)
            observer.date = rt
            sat.compute(observer)
            localrisetime = ephem.localtime(rt)

            timeuntilrise = localrisetime - now

            # include .days so a pass more than a day out still computes correctly
            minutesaway = timeuntilrise.days * 1440.0 + timeuntilrise.seconds / 60.0
            if minutesaway <= _notify:

                if _usevoice and sys.platform == 'darwin':
                    say = 'say "%s WILL BE MAKING A PASS IN %d MINUTES."' % (tle[0], minutesaway)
                    os.system(say)

                print tle[0]
                print ' Rise Azimuth: ', ra
                print ' Transit Time: ', tt
                print ' Transit Altitude: ', ta
                print ' Set Time: ', st
                print ' Set Azimuth: ', sa

        time.sleep(60 * _statussleep)

I learned a very valuable lesson about setting the observer's lat/long values to a double. PyEphem interprets a float as radians but a string as degrees, so passing raw doubles silently puts you at the wrong location. That's a REALLY bad idea and a rather difficult bug to find!
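Here's a minimal sketch of the gotcha; the values are just the ones from my config above:

import ephem

observer = ephem.Observer()

# string: parsed as degrees -- this is what you want
observer.lat = '28.5340'

# float: interpreted as radians, which puts you nowhere near Orlando
observer.lat = 28.5340

# if you start from floats, convert to a string first
observer.lat = str(28.5340)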

Enjoy,
Joe

Thursday, August 20, 2009

TH-F6A Radio Recorder using Mini-ITX

There are a few amateur satellites up in the sky that I'd like to listen to. The problem is that I'm usually neither awake nor home as they pass over. I could cart my radio and antenna around during the day, but that's a pain and I'd get a lot of strange looks in the parking lot. I would probably also need a good excuse for being away from my desk for most of the day :-) Unfortunately, I can't be everywhere at once... but I can surely program a machine to be there for me! Tonight, I hacked together a quick solution to the problem using the following:
  • an old mini-itx machine (EPIA1000)
  • my TH-F6A portable radio
  • the infamous arrow antenna
  • of course, the Python Programming Language
You can see the basic layout in the image below. I needed a way to get the wav files off remotely, which is why you see the Cisco NIC sticking out of the front. If you look closely, you can see a ton of scratches on the box. Believe it or not, I can't see any of these in person; the machine only looks like it's been through a war in the photo!


Now, I needed a quick and effective way to record audio and store it off; pyaudio to the rescue! The code is not optimal in the sense that it waits until the end of the duration to write the stream out to disk. I'm not very concerned about this, as the LEO satellite passes are typically around 12 minutes and there's nothing else on the hard drive.

import pyaudio
import wave

def record(filename, durationMinutes):

    print 'recording for %s minutes' % (durationMinutes)
    Format = pyaudio.paInt16
    Chunk = 1024

    Channels = 1
    Rate = 44100
    Seconds = 60 * durationMinutes
    p = pyaudio.PyAudio()

    stream = p.open(format=Format,
                    channels=Channels,
                    rate=Rate,
                    input=True,
                    frames_per_buffer=Chunk)

    # buffer the whole recording in memory, one chunk at a time
    frames = []
    for i in range(0, Rate / Chunk * Seconds):
        data = stream.read(Chunk)
        frames.append(data)

    stream.close()
    p.terminate()

    # write it all out at the end of the pass
    data = ''.join(frames)
    wf = wave.open(filename, 'wb')
    wf.setnchannels(Channels)
    wf.setsampwidth(p.get_sample_size(Format))
    wf.setframerate(Rate)
    wf.writeframes(data)
    wf.close()
I put in some smarts using datetime and threading to kick off a recording session at a specific time. I then wrote a quick snippet to kick it off for a test. Above, you can see that my radio was tuned in to NPR (I'm not a huge fan of NPR, but the music is good now and then).

import datetime
import threading

def scheduleRecording(outputFilename, datetimeStart, durationMins):

    print 'scheduled recording to file: ', outputFilename
    print 'at : ', str(datetimeStart)
    print 'for %s minutes.' % (durationMins)

    now = datetime.datetime.now()
    wait = datetimeStart - now

    # include .days so a start time more than a day out still works
    waitSeconds = wait.days * 86400 + wait.seconds

    t = threading.Timer(waitSeconds, record,
                        kwargs={'filename': outputFilename,
                                'durationMinutes': durationMins})
    t.start()


if __name__ == '__main__':

    filename = 'test.wav'
    passTime = datetime.datetime(2009, 8, 20, 22, 30, 0)

    scheduleRecording(filename, passTime, 1)
The code is really rough, but right now I just want it to work. It's Python after all; I can easily add a list of different file names, datetimes, etc. Now that this is ready, I need to find a sat pass to record. Ah, but wait! You're probably interested in seeing the final rig. Check it out below in all of its glory.


Now everything seems fine and dandy, but that couldn't be further from the truth. Here are a few problems:

Problem 1 - my HT is not programmable in real time. This means that unless I manually put the radio in my hand and press buttons, I have no way of changing the frequency/tone/mode. No adjusting for Doppler shift during the pass for this rig!! (For a sense of how much shift that is, see the quick calculation below.)
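To put a rough number on the Doppler problem, here's a back-of-the-envelope sketch in Python; the 7 km/s range rate is just a ballpark figure for a LEO pass, not a measured value:

c = 3.0e8          # speed of light, m/s
rangerate = 7.0e3  # rough maximum range rate for a LEO satellite, m/s

for freq in (145.8e6, 435.0e6):  # typical 2m and 70cm downlink frequencies
    shift = freq * rangerate / c
    print '%.1f MHz: about %.1f kHz of Doppler shift' % (freq / 1e6, shift / 1e3)

On 2m the shift works out to around 3 kHz, which FM mostly tolerates; on 70cm it's closer to 10 kHz, which is why a fixed-frequency rig like this one will struggle up there.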

Problem 2 - I do not have any sat tracking software installed, nor do I own a rotator. As a result, my arrow will be sitting outside at an angle to catch what it can. Now, I do have a very nice camera tripod, but with all the rain we've been getting lately, I have no way of protecting it.

This is a quick and dirty test bed that I'm going to be experimenting with for the next few days at least. A neat idea would be to leave it hooked up, record the LMARS net every evening and make the captures available in MP3 format! I may just do that... But for now, it's off to find a satellite pass!

Sunday, August 9, 2009

2M JPole Testing

I hit the outside today to test out a 2M JPole Ed Evenson handed off to me at field day. It's a really nice unit he put together (and a very kind gift). I only had a few minutes just now but got it all strung up for a quick test. I'm going to see if I can catch anyone around this evening. I was able to trip the repeater at 5W and 1/2W, and I only had this thing strung up maybe 9 ft at most. One thing I plan on grabbing today is a nice ladder!

I picked up the SMA/BNC adapter and SMA bridge over at RadioShack. I was actually reasonably surprised at the number of adapters they had available there in store. You can see the attachment piece and bridge connectors below. Not too bad!

If all goes well, I should be back on the net this week.

73's


Wednesday, August 5, 2009

Loading up the FCC License Database

I am putting up a site that requires access to a database containing the latest and greatest list of amateur radio licensees. This data is freely available directly from the FCC's website. They provide the data in a very clean format, consisting of a schema (txt file) and several data files (dat files). They also distribute PDFs that describe the data format in great detail. I hear people on TV and radio constantly taking stabs at the FCC for this or that. Well, I'd like to take a quick second and extend my thanks to the FCC for providing this data in a simple format that is easy to obtain and very well documented. Hats off to you, FCC!

Now, the question is: what is the fastest way to get this data into a relational format, ready to be used by a website? The data is only 355 MB raw, not much to deal with. There are definitely several solutions to this problem in whatever language you choose. This blog posting will describe my solution, using SQLite and Python.

Importing the Schema

The schema txt file comes in a format that cannot be used directly. Depending on your DBMS, you'll need to hack at the formatting a bit. There are 'go' delimiters between each create table statement and none of the creates are semicolon-delimited. I wanted to perform the database creation in a single 'executescript' call using sqlite, so I had to fix up the text. I wrote a few lines of Python that fix up the data and import it into a new SQLite database.
import sqlite3

data = open(sqlFilename, 'r').read()

# hacks: remove go tokens and close off create statements
# so this can be executed as a script
data = data.replace('go\n', '')
data = data.replace(')\n', ');\n')

con = sqlite3.connect(dbFilename)
con.execute("attach '%s' AS dbo;" % (dbFilename))  # the schema references the dbo prefix
con.executescript(data)
con.commit()
con.close()
sqlFilename should be something like "pa_ddef44.txt" (the schema definition file) and dbFilename something like "test.db". After executing the snippet, you'll end up with a complete sqlite database that's as empty as can be. Now we need to get all our data imported. This is where I hit a stumbling block.

Importing the Data Files

I wanted a pure-Python implementation of the import, directly from the data files (without resorting to the sqlite3 executable). All the data files come in a "csv"-like format, where | is the delimiter. The problem with this is that DB-API 2.0 does not provide an "import from CSV" method (and that's a good thing). If I had to build the insert statements manually, it would have been necessary to deduce which fields of each table are varchar, then quote those fields. I definitely did not want to deal with all that jazz, so I went ahead and just used the command-line utility. Thankfully, sqlite has the .import command. I ended up using Python anyway to script it, since there was some necessary mangling of the table name:
import glob
import os

files = glob.glob('*.dat')
for datfile in files:

    # the last portion of each table name comes from the file name
    tableNameSuffix = os.path.splitext(os.path.basename(datfile))[0].upper()
    tableName = 'PUBACC_' + tableNameSuffix

    cmd = 'sqlite3 -separator "|" %s ".import %s %s"' % (dbFilename, datfile, tableName)
    print 'ex: ', cmd
    print 'size: ', os.path.getsize(dbFilename)
    os.system(cmd)
The tableNameSuffix statement probably looks like a huge hack, but the last portion of each table name comes from the file names. Rather nifty! So, what does this all look like at the end of the day? I went ahead and ran it; here are the results:
created database...
size: 193536
ex: sqlite3 -separator "|" test.db ".import AM.dat PUBACC_AM"
size: 193536
ex: sqlite3 -separator "|" test.db ".import CO.dat PUBACC_CO"
size: 43587584
ex: sqlite3 -separator "|" test.db ".import EN.dat PUBACC_EN"
size: 48450560
ex: sqlite3 -separator "|" test.db ".import HD.dat PUBACC_HD"
size: 190622720
ex: sqlite3 -separator "|" test.db ".import HS.dat PUBACC_HS"
size: 339019776
ex: sqlite3 -separator "|" test.db ".import LA.dat PUBACC_LA"
size: 414668800
ex: sqlite3 -separator "|" test.db ".import SC.dat PUBACC_SC"
size: 414686208
ex: sqlite3 -separator "|" test.db ".import SF.dat PUBACC_SF"
size: 416502784
Voila! Done. So, now what can we do with all this? First off, where am I in all this?
sqlite> select * from PUBACC_AM where callsign = 'KJ4JIO';
AM,3077751,,,KJ4JIO,T,D,4,,,,,,,,,,

That's me!! And the T stands for Technician. If we dig into another table, we can find out some more:

sqlite> select first_name, last_name, city from PUBACC_EN where call_sign='KJ4JIO';
Joseph|Armbruster|Orlando


Which makes me wonder, how many other Armbrusters out there have their amateur radio licenses? Let's find out!

sqlite> select first_name, last_name, city from PUBACC_EN where last_name = 'Armbruster';

John,Armbruster,Erma
Catherine,Armbruster,Palm City
Michael,Armbruster,Palm City
John,Armbruster,Denver
Joseph,Armbruster,Orlando
Kirsten,Armbruster,Denver


Quite a list indeed. Let's get some more useful information: how about when each of these individuals was granted their license?

sqlite> select first_name, last_name, city, grant_date from PUBACC_EN inner join PUBACC_HD on PUBACC_EN.unique_system_identifier = PUBACC_HD.unique_system_identifier where last_name = 'Armbruster';

John,Armbruster,Erma,08/21/2003
Catherine,Armbruster,Palm City,03/27/2001
Michael,Armbruster,Palm City,03/27/2001
John,Armbruster,Denver,02/19/2003
Joseph,Armbruster,Orlando,02/11/2009
Kirsten,Armbruster,Denver,03/12/2009
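
Since the whole point is to drive a website with this, it's worth noting the same query works verbatim from Python's sqlite3 module. A minimal sketch, using the test.db created above:

import sqlite3

con = sqlite3.connect('test.db')
cur = con.execute('''select first_name, last_name, city, grant_date
                     from PUBACC_EN inner join PUBACC_HD
                     on PUBACC_EN.unique_system_identifier = PUBACC_HD.unique_system_identifier
                     where last_name = ?''', ('Armbruster',))
for row in cur:
    print row
con.close()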


Problem solved. I can now dig through all the amateur radio licensees of the US and generate some potentially useful results. It would be a fun task to integrate this with Google Earth or Maps to get a view of all amateur radio operators in the US. In fact, that would make an excellent topic for a future posting.

Cheers!

Tuesday, August 4, 2009

ArcObjects Moment of the Day

For those of you that deal with ArcMap often, I figured I would pass this on. If you do any raster processing in Arc and end up getting random errors while processing rasters, pay close attention to this page:

I quote:
Some characters are not allowed in the name of an output raster.

...

# Special characters that are not allowed explicitly are:

( (open parenthesis)
) (close parenthesis)
{ (open brace)
} (close brace)
[ (open bracket)
] (close bracket)
\ (backslash)
~ (tilde)
' (single quote)
" (double quote)
, (comma)
' ' (space)

Pay close attention to that "space"... So, next time you get random errors and have a space in your filename, try putting your raster in C:\Temp or something to see if that fixes it...
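If you'd rather catch this up front than chase mystery errors, it's trivial to script a check. A minimal sketch in Python; the function name is mine, and the character list is just the one quoted above (the docs talk about the raster's name specifically, though a space anywhere in the path seems to cause grief too):

import os

FORBIDDEN = set('(){}[]\\~\'", ')

def is_valid_raster_name(path):
    # only inspect the raster's own name; the directory part can contain backslashes
    name = os.path.basename(path)
    return not any(ch in FORBIDDEN for ch in name)

print is_valid_raster_name(r'C:\Temp\myraster')   # True
print is_valid_raster_name(r'C:\Temp\my raster')  # False: contains a space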

Awesome.

Friday, July 31, 2009

python-twitter, why not?

Why Not?

I have recently started to experience Facebook and Twitter, living the life of a truly 'connected' geek. But as you all could have guessed, being connected isn't all that interesting an idea to me. In the end, what matters most is the Code. How can I interact with Twitter? There is a plethora of libraries available for interacting with Twitter. I heart Python and decided on hacking at python-twitter. The Scripting with Curl article linked off of Twitter was rather informative and provided me a reasonably good amount of background. If interested, most everything done in the article can be replicated with urllib2 in Python, and that's good to know. :-) I needed to install python-twitter and had the choice of installing from source or using easy_install. I'm on Windows at the moment and have become used to using easy_install whenever possible, so I simply executed:
easy_install python-twitter
Post install, I got a warm fuzzy feeling executing:
import twitter
api = twitter.Api()
ooooh, ahhhh, I already feel more connected..... I used the api reference here as my introduction to the api. I usually like to toy a bit at the interactive interpreter to see what types of methods/attributes are on certain objects and see if I can figure things out on my own. The first thing I noticed was that there is a plethora of help available: help(twitter.Api), help(api.GetDirectMessages), etc. That was a +. I also noticed that upon executing api.GetDirectMessages() out of line, I received a good error message with a human-readable description. Yet another huge +. I like to look for things like this from the start, to see how reasonable the API is.

Getting my list of Followers

I decided to attack a simple problem: get my list of followers. I signed up for Twitter around Aug 2008 but never used it or told anyone I had an account. I only started using it seriously in the last week. So, I tried GetFollowers just to see what happens and...

>>> api.GetFollowers()
Traceback (most recent call last):
File "", line 1, in
File "build\bdist.win32\egg\twitter.py", line 1597, in GetFollowers
twitter.TwitterError: twitter.Api instance must be authenticated
Woot! Another good error message. So, let's get authenticated. Upon browsing dir(api), I found a SetCredentials method, which sounds like what I was searching for. The help provided a simple and straightforward explanation of the call:
api.SetCredentials('myusername','mypassword')
All there is to it, no hacking at headers required. Can I get my followers now?
>>> len(api.GetFollowers())
15
I only have a few followers at this point in time; perhaps you can be another? It was far too easy to do this, yes/no? Notice above I was getting the length of the list returned by GetFollowers. This call returns a list of twitter.User objects that allow you to obtain further details on each user of interest. To get the screen names of all my followers, all that needs to be done is the following:

followers = api.GetFollowers()
for follower in followers:
    print follower.screen_name

And the listing is provided. There are several functions available in the api that simplify interacting with Twitter; a quick sketch follows the list. You can trivially:
  • get different user status
  • get your direct messages
  • get replies
  • get a friends timeline
  • post a message
  • post an update (or updates)
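For instance, posting an update and pulling a user's recent statuses look something like this. A minimal sketch against the python-twitter api; the screen name is made up:

import twitter

api = twitter.Api()
api.SetCredentials('myusername', 'mypassword')

# post a status update
api.PostUpdate('hacking at python-twitter...')

# print the latest statuses from a user's timeline
for status in api.GetUserTimeline('someuser'):
    print status.text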
It's pretty slick. Yet another case of Python, along with a nice API making my life simpler.

I'll probably follow up this article at a later time with a more detailed look into it but hopefully you enjoyed the brief.

Yet Another ArcObjects API Adventure

If you develop software with ArcObjects long enough, you will come to learn that experience is what matters most. The developer docs distributed with the SDK are only the first stop to finding out the real story of what to do with an ArcObject. I've been bitten several times by different ArcObjects, and today is yet another one of those times. When it comes to doing GeoProcessing, the idea behind it is simple:
  1. Create a Geoprocessor
  2. Set some properties
  3. Create a geoprocessing tool instance
  4. Set some propertites
  5. Call the Geoprocessor with the instance of the tool
  6. Done
It's cake, at least in theory. Today, I had the need to do some geoprocessing using the ExtractByPolygon operation. According to the docs, the polygon property can be set with an "object" that is a polygon:
Polygon that defines the area to be extracted. X,Y coordinates define the vertices of the polygon. (In, Required)
I proceeded to do what felt like the right thing to do: create my PolygonClass, start building a square that represents my extents and voila!
Failed to execute (Extract by Polygon). Parameters are not valid. Invalid value type for parameter polygon
That's actually a good error message! Now, I have seen this type of thing before when using the geoprocessing library and knew exactly what to do: go to the support forums and start searching. I was bitten by a different operation once that took a string as the object parameter. If you didn't put a literal "; " at the end of the string, the operation would fail (without any indication of why). So I began to adventure into the infamous forums. I dug up an old desktop help article from 9.1 here and noticed the way they were calling it in a script:
ExtractByPolygon_sa C:/data/raster1 "0 0;1 1;2 2;3 3;4 4;5 3;6 2;7 1;8 0" C:/data/final_1 OUTSIDE
I read the description on this page, which states this for the polygon parameter:

Polygon that defines the area to be extracted. X,Y coordinates define the vertices of the polygon.

Yes, completely useless. I could not rely on the description, just the random example that was typed up there. Then I got to creating a string with similar content and it all just worked:

string strPolygon = String.Format("{0} {1};{2} {3};{4} {5};{6} {7}", left, bottom, left, top, right, top, right, bottom);

I really wish there was a moral to this story. I can't think of a good one here besides the age-old developer gripe: "Documentation sucks".

ArcObjects Get Raster Extents (Properties)

I wanted to grab a raster's extents and was trying to avoid opening a RasterDataset and going through all the workspace stuff. Experience has taught me to always look into the DataManagementTools before banging my head. Lo and behold, the GetRasterProperties class was found.
// one Execute call is required per extent property
Geoprocessor geoprocessor = new Geoprocessor();

GetRasterProperties rasterProperties = new GetRasterProperties();
rasterProperties.in_raster = rasterFilename;

rasterProperties.property_type = "TOP";
geoprocessor.Execute(rasterProperties, null);
double top = rasterProperties.property;

rasterProperties.property_type = "BOTTOM";
geoprocessor.Execute(rasterProperties, null);
double bottom = rasterProperties.property;

rasterProperties.property_type = "LEFT";
geoprocessor.Execute(rasterProperties, null);
double left = rasterProperties.property;

rasterProperties.property_type = "RIGHT";
geoprocessor.Execute(rasterProperties, null);
double right = rasterProperties.property;

I really wish there was a way to return more than one property; it seems kind of inefficient doing it this way. Maybe I'll go back and do it the other way, using the IRaster2 interface. Grrr...

Thursday, July 30, 2009

Embedding Images in Word (Python/C#/VB)

I wrote a utility that exports trac wiki content into static html for the purposes of dumping it into a word document (for the powers that be). The Microsoft Word selection.InsertFile method turned out to be quite a lifesaver for this. My exporter utility does some hacking up of the urls so that bookmarks can be created for the hyperlinks. This makes it easy to navigate the word doc, since the user can click the URLs in the doc and it will take them to the appropriate page/paragraph of the content. It all worked out rather nicely, about 50 lines of python for the main word-doc-generation code.

When all was said and done, I handed off the document to others to read and admire. After doing this, I was informed that all the images were missing. I found that hard to believe, since I could clearly see the images on my machine. So I looked into it and found out the InsertFile method does not cause images to be embedded within the word doc. As a result, the only machine the doc worked on was mine, since I had all the images within the docs folder..... nice.

I could not find any nice one-stop method for telling the word doc to "save all images into this doc for me". That's a pain. I posted up an inquiry to the Microsoft Discussion Groups and waited overnight for an answer, but nothing good came through. After waiting another day, Jay Freedman gave me a tip. I ended up hacking it together and using the code provided below:

for shape in doc.InlineShapes:
    if shape.LinkFormat is not None:
        # save the linked picture into the document, then break the link
        shape.LinkFormat.SavePictureWithDocument = True
        shape.LinkFormat.BreakLink()
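
For context, here's roughly how the doc object above might come to exist; a minimal sketch using win32com (the path is made up, and your exporter obviously does more):

import win32com.client

word = win32com.client.Dispatch('Word.Application')
doc = word.Documents.Open(r'C:\Path\To\export.doc')

# ... embed the linked images as shown above ...

doc.Save()
doc.Close()
word.Quit()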


Efficient or not, it works. With that being said, if any of you know of a cleaner way to accomplish this, please let me know!

Testing 555 timers is FUN

I had a few old 555 timers sitting around and got to wondering if they were any good. So, I had a bit of fun putting this tester together.



I think it's one of those things anyone who works with 555 timers should have around the shop... just in case. Thanks, Tony van Roon!

Thursday, May 28, 2009

Trac Wiki Add Attachment Programatically

Yesterday, I was surprised to find out that there is no utility in trac-admin for adding/deleting attachments from a wiki page. I wanted a simple test script to accomplish this. I took a look at the moin2trac.py script and ended up ripping out a small part of it in order to put together a minimal test script. This was tested using trac-0.11.4 with Python 2.5.
import os
from trac.attachment import Attachment
from trac.admin.console import TracAdmin

wikidir = r'C:\Path\To\My\TracProject'

admin = TracAdmin()
admin.env_set(wikidir)

filename = r'c:\Path\To\My\Images\2_0.jpg'

# attach the file to the wiki page 'tutorials/page1'
attachment = Attachment(admin.env_open(), 'wiki', 'tutorials/page1')

size = os.stat(filename)[6]  # index 6 is st_size
attfile = open(filename, 'rb')

attachment.insert(os.path.basename(filename), attfile, size)

The parameters should be reasonably self-explanatory, although do not hesitate to send any questions. I'm thinking it would be reasonably simple to add this functionality to trac-admin. I will probably take a look at this sometime soon and submit a patch.
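Deleting looks just as simple; a minimal sketch under the same setup as above, based on a quick read of trac.attachment (so treat it as untested):

from trac.attachment import Attachment

# open the existing attachment by page and filename, then remove it
attachment = Attachment(admin.env_open(), 'wiki', 'tutorials/page1', '2_0.jpg')
attachment.delete()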

Cheers!

Sunday, May 17, 2009

Emma - Future Hacker

My wife and I waited in the car today while my buddy Adam went into Borders for a few. He ended up coming out with a rather interesting baby gift, namely a copy of this season's 2600 Hacker Quarterly. To our surprise, it has a picture of a baby operator named Emma, complete with a headset! Quite a coincidence, seeing that our little Emma is on her way.


Is this a sign of things to come? Perhaps a future hacker? Either this is one really neat coincidence, or someone is in cahoots with Emmanuel Goldstein.

insert spooky music here...

Monday, May 4, 2009

Bone Discovery - Case Closed (sort of)

Previously, I had concluded that the bones belonged to a canine. If there is one thing that I have learned throughout my life, it's that Experience Counts. So I set out to ask some individuals with industry experience. I ended up sending out a couple of emails to anthropology professors at the University of Central Florida in search of a second opinion.

John Schultz, a Ph.D. from the Department of Anthropology, was the first to respond. He first indicated that I would not need to search for a microchip out in the woods. He then indicated that the partial skeleton had belonged to a juvenile deer that was most likely hit by a car.

Without prior knowledge and looking solely at the spine and pelvis, it does seem as if the bones could belong to a canine. However, after looking at a deer skeleton and seeing the size of the femurs, things become much clearer. My casual attempt at classifying the bones turned out to be incorrect. I was disappointed in myself for not conducting a more detailed analysis. On the other hand, I was reasonably relieved by this, since I would not have to search for that microchip in the brush!

Some people like to see a glass as half empty or half full. I am one of those people who is just happy the glass exists. I'm happy this little deer and I came to cross paths as we did. Life has an interesting way of making things happen sometimes. This little deer had no clue that someone named Joe would eventually be sifting through the brush for its bones. But someone named Joe did, and learned quite a bit from it!

Sunday, May 3, 2009

Bone Discovery - More Findings, More Photos

I walked my dog this afternoon and took another visit to the site. Unfortunately, my camera batteries were dead and I didn't realize this until trying to use it, so I was unable to take photos at the site. I searched around high and low for more specimens. I found several small pieces of bone and cartilage around the hole. I searched within a 5 meter radius of the site and was unable to find any more large pieces. On my walk back, however, I moved very slowly and was able to find one more vertebra. The image below is a summary of my finds for today.


Since dinner was finishing up when I brought them in, I figured it would be better to just wait until after dinner. I cleaned them off with hot water and now they are busy drying. In the meantime, here are a few shots of some of the other bones discovered yesterday.


This is the pelvis. While I was cleaning it, the bone cracked straight through the center. I was really upset about this. In the photo above, the two pieces were leaned up against each other to make it look uniform while taking the picture.


The picture above shows all the detached ribs that I found. These are not displayed in any particular order but were rather aligned so as to minimize the number of photos I needed to take of them.


Here is a picture of four teeth. Altogether, I could identify five teeth, but the picture above is missing one of them.


These are a series of small joints that I did not align to any particular bone. It should be fairly easy to do so, but I'm pretty tired right now. You can see how these fit together in the image below.


This is a collection of various bones, joints, smaller teeth? and pieces of cartilage.


This is the scapula, which is located near the top of the front legs on canines.


I am going to be reasonably busy this week and I'm not certain when I will be able to make another pass at the area to look around. So, in the meantime, you can sift through my previous bone-related posts.

Cheers!

Saturday, May 2, 2009

Bone Discovery - Putting some pieces together

If this is a canine, I believe these would be called the Lumbar Vertebrae. Below, you can see a close up shot of one of the vertebrae.

I have uploaded the highest quality images, so you should be able to click on the image and zoom in effectively. After about 10 minutes of trial and error, I was able to assemble all the vertebrae into a complete column combined with the tail bone.


It's appearing to come together rather nicely. I attempted to continue by assembling the thoracic vertebrae; however, it turned out to be a reasonably difficult task. You can see the individual specimens lined up on the desk above the yellow paper. The pieces that have a half rib attached are also difficult to piece together. I was able to find one that lines up nicely with the existing column and placed it into position. Fortunately, this piece has one of the ribs attached and provides a better picture of the animal.


As these pieces are reasonably difficult to piece together, I decided it was time to do a little research. I needed a frame of reference, so I looked up skeletal diagrams for felines, raccoons and canines. The closest match I found was a canine, so I was right!! The reference skeleton I used can be found here:

http://www.jorvet.com/catalog/images/products/289_309_Pictures/J774.gif

The skeletal structure I have been examining has a pelvis that strongly resembles the one above. In addition, the spinal column lines up quite similarly. The femurs also appear to be anatomically correct. I would venture to say that there is sufficient evidence that this animal was in fact a canine. While lining up the femurs with the pelvis, I made an interesting observation. The right femur is what I looked at first, and it lined up great. The left femur, however, appears to have undergone a compound fracture. The image below shows the right femur on the top and the left femur on the bottom.

The image below provides a close-up of the compound fracture.


In addition, two hairline fractures can be noted on the right femur. Each of these fractures is on an opposing side of the bone. Below are two images of the hairline fractures.


The area where this specimen was discovered is roughly 3 meters from a side street that cars typically fly down. I am going to venture to say this canine was hit by a car and managed to work its way towards the brush. While in pain, it decided to sit down in the shade underneath the trees where I found it.

I did not think about this at first, but in the place where I discovered the bones, there was a hole dug out. It was roughly a foot or so deep and all the bones were found centered around it. Most bones were found right in the vicinity of this hole and around the outside of it. I'm wondering if the dog was in severe pain and dug out the hole? When dogs dig, they have quite a bit of weight on their hind legs and move their front legs rapidly in the regular digging motion. This would require putting a bit of weight on the hind legs. This theory doesn't seem to add up, since at least one leg was severely damaged. So now I'm really confused about where the hole came from and why?? I'll provide some photos tomorrow of the surrounding area. Perhaps you can help me solve this mystery.

Bone Discovery - To be or not to be a Scientist

I graduated from the University of Central Florida with a BS in Computer Science and a minor in mathematics. By definition, that makes me a scientist, and naturally scientists are curious about things. Anyone who knows me is aware of how curious I am about all sorts of things. Today, curiosity got the best of me, and hence this blog post.

I walk my dog most every evening while participating in the LMARS net (on 174.285 for those interested). I take different paths each night while walking, but most nights I pass a wooded area. Recently, I noticed a set of bones that were roughly 2 meters in from the sidewalk and reasonably exposed to the elements. There was a hole dug out in between where they were resting, which did not make much sense to me. I will head out to the area of interest tomorrow to take some photos so that everyone can see what I'm talking about. Not too long ago, this animal was an eating, breathing, living being that was busy doing its business. What type of animal is it? How did it die? Why did the animal die at this location? I started asking myself all sorts of questions and decided to start searching for some answers.

After examining the bones for a few more minutes, I decided I should pack up my pooch, head back to my apartment, grab a box and some gloves, and start collecting the specimen. That's exactly what I did. I excavated all sorts of small bones and fragments from the site. One of the leg bones was roughly 3 meters away from the rest of the bones, and I found several ribs roughly 2+ meters away. I expect other animals (vultures) probably got to chewing on some of the remains and carried these bones a distance from the main site. I had to sift through all sorts of foliage, dirt, branches, etc. to pull them out, but I'm reasonably satisfied with my findings. All in all, I spent roughly 30 minutes collecting. Unfortunately, I was unable to find the skull!! I am going to make a second attempt at the area tomorrow to see if it was dragged away.

I placed all the bones into a cardboard box and carried them home. I then proceeded to wash them by hand in warm water, wearing gloves. I was careful to keep all bones that were still attached with flesh together. Some of the pieces detached during cleaning, including the hip bone. After being washed, I put all the bones on a piece of plastic on top of napkins to dry out. You can see the results below.


As stated above, I'm a computer scientist, not an anthropologist. My mom did teach anthropology in college and I learned a bit about it when I was a kid. Outside of that, I've never taken a course and have no 'professional experience' in such things, so if you or someone you know can shed some light on this topic, please let me know. I was immediately able to identify what appeared to be the following bones:
  • several ribs
  • several vertebrae
  • several thoracic vertebrae
  • a scapula
  • front leg bones
  • hind leg bones
  • tail bone
  • pelvis
  • several teeth
If I were to make an 'un-educated' guess at this time, I would wager on canine. But that's enough guessing games for me; it's time to get back to the table and start fitting the pieces together!

Tuesday, April 14, 2009

FreeDOS Virtual Machine

Hacking at legacy apps is always an adventure, and this evening it was an enjoyable one. I just had the opportunity to set up a FreeDOS virtual machine using the fdfullcd.iso image. It was a very pleasurable experience, so hats off to you, FreeDOS team. I haven't run into any issues so far, but I will be sure to report any if I do.

It's just.... beautiful....

Anchors in Trac

I had the need to use anchors today within a single page in Trac. The basic idea was to have a table of contents at the top of the page that would allow the user to link to subsections within the same page. After reading through the WikiFormatting pages and Google, it turns out the answer is quite simple. All of the headings are turned into anchors, whose names are the heading tokens with the spaces removed. For example, the following heading:
= My Heading =
Would be translated into the following markup:
<h2 id="MyHeading">My Heading</h2>
Within the contents section, you can link to a specific anchor via:
[wiki:somesite#MyHeading Links to My Heading Anchor]
So it turned out to be reasonably simple, just not so obvious to me at first.
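
Since the anchor name is just the heading text with the spaces stripped, it's also trivial to compute if you're generating links from a script. A toy illustration in Python, not Trac's actual implementation:

def heading_to_anchor(heading):
    # '= My Heading =' becomes the anchor id 'MyHeading'
    return heading.strip('= ').replace(' ', '')

print heading_to_anchor('= My Heading =')   # MyHeading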

Sunday, April 12, 2009

Scraping Amsat Satellite Status Data

I recently had an interest in scraping the satellite summary and band information from the Amsat website. As usual, Python and BeautifulSoup to the rescue! Unfortunately, the html is not structured in a very friendly way when it comes to parsing out the tags of interest. However, after a brief analysis, I figured out most of the rules to apply and was able to dump the data of interest out. If Amsat decides to change the structure of their satellite status pages, this code will need to be adjusted.
import types
import urllib
from BeautifulSoup import BeautifulSoup

# strip off all the satellite summary title tokens, like oscar designation,
# oscar number, etc...
def GetSatelliteSummaryTitles(soup):
    tds = soup.findAll('td', {'align':'right', 'valign':'top', 'nowrap':None})
    titles = []
    for td in tds:
        #print td
        b = td.find('b')
        # the page pads titles with &nbsp; entities; strip those and the colons
        cleantitle = b.contents[0].replace('&nbsp;','').replace(':','')
        titles.append(cleantitle)
    return titles

# strip off all the satellite summary value tokens, like AO-51, 11.140 Kg, etc..
def GetSatelliteSummaryValues(soup):
    tds = soup.findAll('td', {'valign':'top', 'nowrap':None})
    values = []
    for td in tds:
        #print td
        b = td.find('b')
        if b is None:
            cleantitle = td.contents[0]
            values.append(cleantitle)
    return values

# return a dictionary containing mode data
def GetModeData(soup):
    tds = soup.findAll('table', {'width':'75%'})
    tds = tds[0].findAll('td')
    i = 0

    currentmode = None
    modedata = {}

    for td in tds[1:]:
        # skip the spacer cells the page fills with &nbsp;
        if not '&nbsp;' in td:
            if not ('valign','top') in td.attrs:
                b = td.find('b')
                span = td.find('span')
                if (not b is None) and (not span is None):
                    if (not 'Broadcast:' == b.contents[0]) and (not 'BBS:' == b.contents[0]):
                        currentmode = b.contents[0]
                        if not modedata.has_key(currentmode):
                            modedata[currentmode] = []
            else:
                if 'Callsign(s)' in td.contents[0]:
                    continue
                else:
                    cleanmode = td.contents[0].replace(':','')
                    modedata[currentmode].append(cleanmode)
        i += 1

    # trim off any trailing non-string entries (nested tags) in each mode list
    for key in modedata:
        for i in xrange(len(modedata[key])):
            item = modedata[key][i]
            if type(item) == types.InstanceType:
                modedata[key] = modedata[key][0:i]
                break

    return modedata

def AppendStatusToFile(outfile, summaryMap, modeMap):

    # write all the summary data
    for key in summaryMap:
        outfile.write(key + '\n')
        outfile.write(' ' + summaryMap[key] + '\n')

    # write all the mode data (use the map passed in; an earlier version
    # mistakenly recomputed it from a global soup)
    for key in modeMap:
        outfile.write(key + '\n')
        for mode in modeMap[key]:
            outfile.write(' ' + mode + '\n')

    outfile.flush()


if __name__ == '__main__':

    output = open('allstatus.txt','w')

    for i in range(1,200):

        urlToProcess = 'http://www.amsat.org/amsat-new/satellites/satInfo.php?satID=' + str(i) + '&retURL=/satellites/status.php'

        try:
            output.write('=== STATUS URL:' + urlToProcess + '\n')

            url = urllib.urlopen(urlToProcess)

            pagedata = url.read()
            print 'processing url: ' + urlToProcess

            # keep a copy of the raw page around for debugging
            filename = str(i) + '.html'
            open(filename,'w').write(pagedata)

            soup = BeautifulSoup(pagedata)

            # get all the summary data
            titles = GetSatelliteSummaryTitles(soup)
            values = GetSatelliteSummaryValues(soup)
            summaryMap = dict(zip(titles,values))

            # get all the mode data
            modeMap = GetModeData(soup)

            AppendStatusToFile(output, summaryMap, modeMap)

        except Exception, e:
            print 'ERROR processing: ', urlToProcess
            print 'ERROR details: ', e

    output.close()


For the time being, I plan on using a template engine to format this data into an xml file that can serve as an RSS feed. I'll provide this to the Amsat-bb list to see what they think.

If you have any comments or suggestions, let me know.

Cheers,
Joseph Armbruster
KJ4JIO

Saturday, April 11, 2009

Friends

This goes out to all of my friends.

Thursday, March 12, 2009

Google Earth Panoramas Posted

Checked out Google Earth tonight and my panoramas are officially visible. Awesome!! If you browse around campus you can find them, but you have to be zoomed in reasonably well. I'm going to try to fix that tonight. If you're anxious to see them, you can cheat and visit my Panoramio page.

Enjoy!

Saturday, February 21, 2009

UCF (University of Central Florida) Panoramas

With the wife being out of town this weekend, I decided to hit the streets. I had a field day with my Kodak P850 around the UCF campus. The goal: to make a bunch of panoramas. My camera has been giving me all sorts of headaches and I'm convinced it is nearing the end of its life. I'm disappointed because the camera seemed to be good at first. I had to take it to Best Buy once during the first year because it started turning off by itself. Nowadays, both the EVF and LCD displays flake out randomly, and by flake out I mean go completely black or white depending on the amount of brightness in the field of view. Every few minutes I have to play with the zoom in order to get the displays to even show up. It's a pain, and I haven't even had the camera more than two years. Amidst all the fighting with it, I ended up taking some reasonably good shots around campus. An added bonus was the absolutely amazing weather we have today! Oh, for the record, all the photos here were taken with the camera mounted on my tripod.

I started off entering campus and driving through the parking garage closest to the education building. I went all the way to the top and thought I would start by taking some shots around the outside edge of the parking garage. If I were to do that, I would have had to go all the way around (since I'm ambitious)... but since I'm lazy, I decided to just take one of the garage itself as it looks from the interior. Altogether there was a total of 19 shots.

This was my first experience using a tripod to create a panorama, and I learned a few lessons along the way. For the other panoramas I've taken, I simply held the camera in my hands and used myself as the tripod. All in all, the tripod was a great idea, just a pain to carry around everywhere, not to mention my tiny camera doesn't really Look right on such a large tripod. Seeing that it was roughly 50 degrees outside here in Florida, the next stop was appropriately Starbucks. Of course, I was too excited about taking more shots, so I didn't grab any caffeine. Although, just typing about it makes me Crraavveee one. Total shots here was 13.

I then took a stroll to the UCF Student Union, looking for good spots. It looks like they have elections coming up, so there were banners and tents all over the place outside the front. There were also quite a few people walking by. It didn't seem like a good spot, so I decided to take a stroll behind the union, where there were fewer people and other things. This series was made up of 14 shots.

Campus is huge and I wanted to take a shot or two near the towers, so I started walking back to my car. I made a quick stop between the administration and education buildings and centered my shot on the Teaching Academy building.

Then I decided to hit the car and drive to the towers side of campus. That was a pain, since I left my car on the top of the parking garage (dumb idea). So, drive I went to the other side of campus with the windows down. The towers side of campus is full of parking meters and I wasn't sure if I had to pay or not on the weekends. I asked a lady in an SUV and she said 'probably'. I decided to park in a visible spot and not pay, just in case I didn't have to :-) There's this flat median-like area in the street that is most likely meant for students to gather on when trying to make the leap of faith across the street during busy days. It was an excellent spot to set up the camera and start snapping, so I did; all 14 shots worth.

Altogether it was a reasonably productive outing. I learned a bit more about my camera, using a tripod in different places and enjoying the great outdoors. Before leaving campus, I had to make one last stop, Lake Claire. There was a family out in the picnic area on the west side of the lot, so I decided not to point my camera in that direction. They were playing some game where you throw these colored balls into the sand... it was strange; I'd never seen anyone playing it before. I should have asked what it was all about, but I didn't. Instead, I took 17 shots of good ole nature.


There were two other groups of images that I did not show here: one set on the first floor of the library and one of a line of vending machines near the Starbucks. Both were failed attempts at stitching but were definitely good learning experiences.

If you're interested in obtaining higher resolution versions of any of these images, go to my homepage joevial and click on panoramas. Blogspot has a limit on the amount of data that can be uploaded per blog, so I host them separately.

Enjoy!

Sunday, February 8, 2009

STS-119 Discovery Panorama

I took a trip to the Kennedy Space Center today with some family and friends. It's been years since I'd been to "The Cape" (as it's known to Floridians) and it's still really fun! I was like a kid at the fair the entire time. I didn't know which way to look or what to take a picture of next. Since I live in Orlando, I decided to purchase a year pass. I'm definitely going again soon; in fact, I'm going to try to take a trip next weekend (or in the next couple of weeks) to make some more panoramas. I took several photos off the LC-39 Observation Gantry in an attempt to create some panoramas from them. For those of you not familiar with this observation deck, I took a nice shot for you.

I don't remember seeing this when I was a kid, but it was completely amazing to me. I'd like to say Thank You to NASA and all the taxpayers for making this available. This platform provides an excellent view of a large portion of the cape. In addition, on each deck, they have observation guides that give you details as to what you're looking at out in the distance. If you look near the center of the picture, you'll see something that looks like an engine. Don't doubt yourself, that's exactly what it is! It's a main engine from the shuttle. Read the brief about it below.


What's really neat about the way they set it up is that you can get a view of every angle of it; top, sides and bottom, all up close. Here's a quick shot of what it looks like from the side.

I took a circle of pictures all the way around, looking down from the top and around the sides. On the ground, I laid my camera on the floor and put the ten-second timer on to capture the whole thing. Ok ok, I got sidetracked, so back to the panoramas. Of course, without software like Photoshop, it's rather difficult to touch up the pictures in Windows. So, I went ahead and did what I could to clean them up in Paint.NET, since I'm on a Vista box at the moment (I know, I know... GAG ME). Off to the east, I caught a glimpse of Discovery out on launchpad 39A and thought this would be an excellent snippet to share with the world. For those of you that did not know, the Discovery vehicle is preparing to launch at the end of this month. To learn more about the crew and mission, read more here.

If you look out to the south east, you can see the LC-41 Atlas V launchpad, which is still active.

If you're interested in viewing some of the high-res photos, you can obtain them here: Atlas V Launchpad Panorama Discovery STS-119 Panorama.

I have a set of photos that cover the view from the Observation Gantry full circle. Unfortunately, I was unable to completely 'panoramorize' (?) them. I have TONS of other photos of just about everything you could possibly see. Before leaving the park, we were fortunate enough to sit in on a presentation given by an astronaut, namely Roger Crouch. Before this presentation, I'd never heard of him. He gave us details on his background and how he came to be an astronaut. He also provided a rather gory look into some of the things they do while preparing astronauts for launch, such as:
  • tagging your body parts (for identification purposes if necessary)
  • giving you an Oxygen tank (in case you go unconscious for whatever reason)
  • placing shark repellent in your suit (to keep those away if you become dismembered)
Sign me up! I'd be too busy jumping with excitement to be worried about anything. That being said, this guy was full of interesting information and experiences. He spoke for about an hour (approximately; I failed to note the time). One fact that he focused on was his color blindness and how it affected him throughout his life. For starters, he wanted to be a pilot, and the services had to turn him away because of it. He spent years writing letters to NASA in an effort to become an astronaut and was turned down over and over. Finally, after years of trying, he was recruited as a Payload Specialist (which he seemed to portray as his "work-around" to becoming an astronaut).

Around halfway through his speech, I saw a few groups of people just walk out. One was a group of three people and the other a group of four. I was burning with anger. I have no concept of why people would just get up and walk out like that. I mean, how many astronauts have you been able to ask questions of and take pictures with? There were maybe 20 people in the room total; such a great opportunity! So, enough rant... The one idea that he stressed throughout his presentation was that you should Never Give Up. It took him years to become an astronaut, and if he hadn't been persistent it probably wouldn't have happened. All in all, I feel honored to have had the opportunity to sit down and listen to what he had to say. I'm proud to say I know who Roger Crouch is, and he is definitely an inspiration to me. After all was said and done, my wife and I took a picture with him :-)


All in all, today was an excellent day. I'm definitely going back soon. Two thumbs up, NASA!

Saturday, February 7, 2009

joevial - The Return

Joevial is back online and smaller than ever! Due to the cost, I had to abandon my rental server from LayeredTech. That was quite a few months ago, and since that time my web presence has been minimal (except for this blog). In order to maintain my web presence, I have resorted to taking extreme measures:


The website is currently powered by a 500MHz AMD Geode. I have 8 GB of compact flash onboard but nowhere near that quantity of content. Don't forget to check it out: www.joevial.com

Thursday, February 5, 2009

Technicians License - Finally

Yeah, yeah, I know... I have no excuse for not doing this long ago... Some of the guys at work were talking about radio stuff this week and I had to jump into the conversation all excited. I went home and decided I was going to quit being lazy and become a ham. I called up the OARC and scheduled to take my Technician exam. I left work 30 minutes early yesterday to ensure that I would arrive on time. It was 36 questions, all centered around electronics, radio and regulations. There were quite a few hams there, and they had a really neat presentation on D-STAR. I look forward to having my operator's license and getting involved in the community.

Speaking of community, I spent this evening over at a LMARS meeting. They had some high school students give a presentation on a robot they competed with. It was a pretty neat little bot that had to pick up hockey pucks from one location and drop them in another.

In other news, I've been doing more testing with my XTend-PKG modules. Below, you can see my base unit (top) and the portable unit (bottom).


I wanted to do some testing, except I can only be in one place at a time :-) I could have used the X-CTU, but I didn't want to lug my laptop around. Soooo... I wrote some quick code to send/receive the packets and test them for completeness. The code would validate the packets, then text my phone the good/received reading every 2 seconds. For the record, I used C# for this code, so System.IO.Ports to the rescue! This way, all I had in the car with me was the portable unit and my cell phone (which I would have with me anyway). At 500 mW, I was able to get just over a half mile out of it. That's with an apartment building in the way too, so it wasn't too shabby. I'll probably do some testing at UCF this weekend, using the parking garage to place the base a bit higher up...

While putting together this blog post, I've been shaking at my desk... not to mention my nose is cold to the touch. I just looked at my AccuWeather-powered receiver and it's 38 degrees!

Well, I need to warm up... For now, I'm off to bed.

Oh, Call sign coming soon!

Wednesday, February 4, 2009

New Flight of the Conchords Season

Season two is here. Thanks for the reminder, Dan!

Learn more about em here: http://www.hbo.com/conchords/

Looking forward to their concert in April!

Tuesday, February 3, 2009

Log4Net - Configure at Runtime to Log to Textfile

I spent a few minutes hacking at log4net today. It's been a very long time since I used it, and I've always used it along with an app.config. Today, however, I attempted to use it for other purposes that required specifying the log4net configuration parameters at runtime, so that output could be directed to a text file. I'm not a log4net expert, nor do I claim any expertise in this area. That being said, this seemed to work for me.

I created a FileStream and TextWriter to start things off. LOGFILEPATH is a string that indicates the full path to where the log file will be stored:

FileStream filestream = new FileStream(LOGFILEPATH, FileMode.Append, FileAccess.Write, FileShare.ReadWrite);
System.IO.TextWriter textwriter = new StreamWriter(filestream);
Next, I looked into the PatternLayout class for an example pattern to use for log output. I found a good reference online here.

log4net.Layout.PatternLayout patternlayout = new log4net.Layout.PatternLayout();
patternlayout.ConversionPattern = "%timestamp [%thread] %level %logger %ndc - %message%newline";
patternlayout.ActivateOptions();  // the pattern does not take effect until options are activated

I then created an associated TextWriterAppender. This is where we associate the TextWriter and PatternLayout together into a bundle. I also set the ImmediateFlush option to true.

log4net.Appender.TextWriterAppender textwriterappender = new log4net.Appender.TextWriterAppender();
textwriterappender.Writer = textwriter;
textwriterappender.Layout = patternlayout;
textwriterappender.ImmediateFlush = true;
textwriterappender.ActivateOptions();

Unfortunately for me, I first started this endeavor by creating my own Appender... which wasn't too bad, but after browsing through the source tree and checking out src\Appender, I figured I'd just use theirs. So I did! To finish it all up, I simply initialized the BasicConfigurator with the created textwriterappender.

log4net.Config.BasicConfigurator.Configure(textwriterappender);
I also needed to set the debug level. After some mild google-ing, I did this:

((log4net.Repository.Hierarchy.Hierarchy)log4net.LogManager.GetRepository()).Root.Level = log4net.Core.Level.Debug;
So, there you have it. After this, you simply use log4net as you normally would in your code. For me, it was as simple as:

log4net.ILog log = log4net.LogManager.GetLogger(typeof(MYCLASS));

Where MYCLASS is the class I was interested in logging. I'm not sure if this is the correct way to go about doing this, but it appeared to work for me. I'll need to spend a bit more time reading through the source and learning the 'right way' around the library. It's interesting to note that within log4net-1.2.10/src/Appender, there are several other types of Appender classes defined for use, including AdoNet, Console, Telnet, Udp, etc. Good to know in case you are ever interested in heading down that road.

If you know of a better way to perform this type of runtime configuration, please let me know. If not, hopefully this works for you!

Cheers!