I was recently motivated to run WorldWind on the OUYA. After a bit of research and some evening hackery, I was able to create a basic Earth-viewer activity.
I also hooked up the controller so that the d-pad pans and L1/R1 zoom in and out. It's all rather good! My next goal is to use AXIS_LS_X/Y for panning; I already have it hooked up, but I need to work on smooth transitions. More to come on this; for now, you can enjoy this short video:
My OUYA arrived in the mail yesterday. Tonight, I dug through the manual and was surprised to find that no license agreement was provided. There is a section of the manual titled "The License Agreement," but it's an interesting one; read on:
This is a close-up of the right hand page.
This is a close-up of "The License Agreement" section.
When I browse to http://www.ouya.tv/support/license right now, this is what I see:
Well now, that was unexpected. I tried using the support forum (https://ouya.zendesk.com/home) to search for "license" and this led me here:
While searching around the internet, I discovered that many others are interested in obtaining the Linux kernel source for the OUYA; see:
...and sending an inquiry to OUYA support with the following request:
'''I received my OUYA yesterday. Tonight, I was reading through the manual and decided to read through the license agreement. The resource www.ouya.tv/support/license does not appear to exist. I would like to request a copy of the license agreement for my unit, model: OUYA1'''
Within the console, there is a Legal Information section that details the following:
- Android Open source licenses (licenses for libraries used by OUYA software)
- OkHttp by Square, Inc.
- Picasso by Square, Inc (software added by OUYA)
- iPerf by The University of Illinois
The problem with these is that my unit received several updates as soon as it was connected to the internet. I am mostly interested in obtaining the license agreements for the software that was originally distributed to me. I guess we'll see what happens!
I generated a volumetric display of myself using a Kinect. The sensor is on the table directly in front of me, roughly a meter away. It is pointing up at me, which is why you see the shadow void behind me.
I recently became interested in creating 802.11 control and management frame plots around Orlando. Naturally, the first problem was figuring out how I could capture GPS coordinates on my laptop. Being GIS savvy, I can think of a few ways to do this, but what fun would my blog post be if I took the easy way out? The reality is, I have a Lassen GPS unit that would get the job done easily. Unfortunately, I am not interested in carrying the unit in and out of the car for data collection sessions, and I am most definitely not going to leave it in my car to bake in the Florida heat. Instead, I took an alternative approach and used my MyTouch 4G Android-powered cellphone as a USB accessory. A few lines of Java later, I was streaming lats and longs to my MacBook Pro via USB. Here is what my GPS coordinate streaming app looks like (in all of its glory). Notice how I am strictly adhering to the Android UI design guidelines. When you press the 'Start' button, GPS coordinates begin streaming to the host.
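On the host side, the receiving end of this is simple: read a line, parse a fix. Below is a minimal Python sketch of that parsing step; the "lat,lon,epoch-millis" line format is an assumption made for illustration, not the actual wire format my app used.

```python
# Hypothetical host-side parser. Assumes the phone emits one GPS fix per
# line as "latitude,longitude,epoch_millis" -- an illustrative format,
# not the app's actual one.
def parse_fix(line):
    """Parse one streamed fix into (lat, lon, epoch_seconds)."""
    lat, lon, millis = line.strip().split(",")
    return float(lat), float(lon), int(millis) / 1000.0

# Example: one line as the phone might emit it.
fix = parse_fix("28.5383,-81.3792,1310000000000\n")
```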
If that screenshot did not keep you interested, maybe the next topic will: frames! The pipeline I used to capture the 802.11 frames was rather simple; it boiled down to:
At this point in time, the shapefile contains control frame addresses and the privacy bit of the Capability Information field (B4, as defined in the 802.11 specification). In the plots below, each point represents a single GPS coordinate collection event. Green points represent private networks, while red points represent public (aka guest) networks. Each point may contain multiple 802.11 telemetry frame details. This is because I am not dead-reckoning a unique coordinate for each telemetry frame based on the delta-T between the last GPS coordinate epoch and the telemetry frame's epoch. Doing this would get messy rather quickly, so I just kept it simple. Here is a plot of several telemetry frames that I captured while driving through downtown and down the 408 (and back).
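In code, the "keep it simple" approach amounts to binning each frame under the most recent GPS fix at or before its timestamp. Here is a hedged Python sketch of that grouping step; the data structures are illustrative stand-ins, not my actual capture pipeline.

```python
import bisect

def assign_frames(fixes, frames):
    """Group frames under the most recent GPS fix at or before each
    frame's timestamp (no dead reckoning between fixes).

    fixes:  list of (epoch, lat, lon), sorted by epoch
    frames: list of (epoch, frame_details)
    Returns {fix_index: [(epoch, frame_details), ...]}.
    """
    epochs = [fix[0] for fix in fixes]
    grouped = {}
    for epoch, details in frames:
        i = bisect.bisect_right(epochs, epoch) - 1
        if i >= 0:  # drop frames seen before the first fix
            grouped.setdefault(i, []).append((epoch, details))
    return grouped
```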
Below you can see a closeup of the frames that were collected around the 408 / 417 interchange. It is important to reiterate that there are many points in each spot, where each record represents an individual frame.
Let's zoom in a bit closer and examine one of these sets of frames. Below, I am selecting the rightmost set of frames (the ones highlighted in yellow). Within the attribute table, you can examine the described control and management frame data.
I slowed my vehicle down while exiting the 408 and as a result my frame density increased. The density continued to remain high as I drove throughout downtown Orlando.
Here is a closeup of some of the frames I collected around downtown (near Lake Eola). See how many GPS anomalies you can find.
Below, you can observe another selection of telemetry frames. Notice the number of records in the attribute table and the number of selected frames.
I intend to expand my data set and create some more interesting plots in the future. For now, the most interesting plot I can provide is this shot, showing all management frames that were collected with an SSID of DefaultSsid2-1.
This evening, I needed to generate a custom map of antenna locations for the great State of Florida. I snatched some data from the FCC and was quickly on my way. I decided to use state borders, county borders, and roads as my reference features (note: no hydro). You can see the neat border effect I produced by combining the county and state borders.
You can clearly identify your relative location within the state, despite the road networks being mostly translucent. You can also clearly identify antenna locations using the legend.
As you move to larger scales, roads become well defined and antenna locations appear more precise.
At this point, roads have well defined widths and antenna locations are labeled using their call sign.
Let's face it: most county and state map services have serious problems. They are notoriously slow, offer little in the way of filtering, and are nearly impossible to customize. If you're lucky, you can toggle layers on and off, but not without a 1-5 second delay. Annoying! No one wants to see an hourglass after panning a map, right? For anyone trying to derive useful business results, these services are not sufficient. This is where creating custom, beautiful maps from features comes in handy. You can carry them in your back pocket or share them with the world; the choice is yours. And because they can be customized on the fly (in software, via a web service), the possibilities are endless!
I was provided with a collection of shapefiles containing many different features, including lakes, parks, conservation areas, parcels, etc. From this set of data, I needed to automatically create customizable, interactive maps; specifically, real-estate maps of Orange County, Florida. Once finalized, the base features and styles will be loaded into a popular map service where the client will become empowered. Clients will access the map service using a web browser. The presentation will be driven by a popular slippy-map control that will allow the user to query and style the data as desired. The best part is that all changes to feature geometry and attribution will be automatically reflected in the map (thank you, Python + Mapnik). Here are some initial shots and descriptions.
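For a feel of what drives the rendering, here is a hypothetical Mapnik style fragment, roughly in the XML form Mapnik accepts. The layer name, shapefile path, ASSESSED field, threshold, and colors are all illustrative; none of them are taken from the actual data set.

```xml
<!-- Illustrative Mapnik style/layer fragment; names and values are
     hypothetical, not from the real Orange County data. -->
<Map background-color="white" srs="+proj=longlat +datum=WGS84">
  <Style name="parcels">
    <Rule>
      <Filter>[ASSESSED] &gt;= 500000</Filter>
      <PolygonSymbolizer fill="#7f0000"/>
    </Rule>
    <Rule>
      <ElseFilter/>
      <PolygonSymbolizer fill="#fee8c8"/>
    </Rule>
  </Style>
  <Layer name="parcels" srs="+proj=longlat +datum=WGS84">
    <StyleName>parcels</StyleName>
    <Datasource>
      <Parameter name="type">shape</Parameter>
      <Parameter name="file">parcels.shp</Parameter>
    </Datasource>
  </Layer>
</Map>
```

Because the XML is just text generated by the Python side, restyling a layer on the fly means rewriting a fragment like this and re-rendering.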
At small scales (zoomed out), only the most prominent features are visible: major highways (and other transportation features), conservation areas, and hydro features, allowing a user to center on an area of interest.
As you zoom in, more features become visible. Minor roads can be clearly identified and lakes are labeled. Notice how you can almost see the individual parcels.
After zooming in some more, you begin to identify individual parcels and boundaries.
Ultimately, assessed values can be identified by a color gradient.
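One simple way to realize such a gradient is to linearly interpolate between two colors based on the assessed value. A Python sketch, with illustrative thresholds and endpoint colors (not the ones used for the real map):

```python
def value_to_color(value, lo=50000, hi=1000000):
    """Map an assessed value onto a light-to-dark red hex color.
    The lo/hi thresholds and endpoint colors are illustrative."""
    t = max(0.0, min(1.0, (value - lo) / float(hi - lo)))  # clamp to [0, 1]
    start, end = (254, 232, 200), (127, 0, 0)  # light peach -> dark red
    rgb = [round(s + t * (e - s)) for s, e in zip(start, end)]
    return "#%02x%02x%02x" % tuple(rgb)
```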
Below you can observe the layering effects with the prominent features (parcels, hydro, transportation, and conservation).
Close examination of the parcels reveals their assessed values.
I added support for satellite footprints using a spherical earth model approximation. I added the bare-minimum code and configuration options required to do this. You can control footprint visibility and color using two new configuration options: show_footprints and footprint_color.
When show_footprints is true, each satellite's footprint is rendered. Below is a snapshot of SRMSat with footprints enabled.
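The spherical-earth approximation keeps the footprint math compact: the footprint is a circle around the sub-satellite point whose Earth-central half-angle follows from the satellite's altitude. A Python sketch of the idea (not the actual implementation I added):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean radius; spherical-earth approximation

def footprint_radius_km(altitude_km, min_elevation_deg=0.0):
    """Ground-range radius of a satellite footprint on a spherical earth.

    With min_elevation_deg = 0, the footprint extends to the horizon.
    Raising it shrinks the footprint to the region where the satellite
    appears at least that high above a ground station's horizon.
    """
    re = EARTH_RADIUS_KM
    e = math.radians(min_elevation_deg)
    # Earth-central angle from sub-satellite point to the footprint edge.
    lam = math.acos(re * math.cos(e) / (re + altitude_km)) - e
    return re * lam
```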
Ground station line of sight indicators can be rendered simultaneously. Below you can observe two example ground stations within the footprint of SRMSat. Both ground stations are displaying their line of sight indicators.
After a few seconds, I took another shot from the top. Notice how the ground stations are right near the edge of the footprint and line of sight indicators are still enabled (as expected).
As the footprint moves away from the ground stations, we would expect to lose line of sight. Below, you can observe an example ground station right on the edge of the footprint.
Eventually, the ground station will sit right outside of the footprint and line of sight will be lost.
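In other words, the visibility behavior the screenshots walk through boils down to comparing the great-circle distance between a ground station and the sub-satellite point against the footprint radius. A hedged Python sketch of that check (not the renderer's actual code):

```python
import math

EARTH_RADIUS_KM = 6371.0  # spherical-earth approximation

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance between two points, in km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def has_line_of_sight(station, subsat_point, footprint_radius_km):
    """A station has line of sight iff it lies inside the footprint,
    i.e. within footprint_radius_km of the sub-satellite point."""
    lat1, lon1 = station
    lat2, lon2 = subsat_point
    return great_circle_km(lat1, lon1, lat2, lon2) <= footprint_radius_km
```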
Now that basic footprint functionality is complete, we have many more possibilities. The next step could be to plot collections of ground station networks and plan telemetry collection schedules. But before doing this, I am going to finish implementing the same type of visualization but for ground station visibility.