A chump bot (http://www.w3.org/2001/09/chump/) sits in an IRC (chat) channel and remembers any URLs (web addresses) people mention. It displays them on a web page for later reference. I spend time in #infoanarchy on the freenode network (freenode.org), where someone runs a chump bot whose output is visible here: http://peerfear.org/chump/
A wiki is a website anyone can edit. Every page has an edit button at the bottom which anyone can press to edit the page. Wikis grow organically and are great for group collaboration. Some friends and I set one up and plan and track most of our group activities with it. The most famous wiki is http://c2.com/cgi-bin/wiki?WikiWikiWeb
I wanted to combine these two resources so that anything said in the IRC channel my friends and I chat in was recorded on the wiki we share. I sat down and wrote a Perl script to do just that, and it took surprisingly little time. Yay for the HTML::Form module. The output can be seen here: http://www.arioch.org/phpwiki/index.php/WikiChump
Attached is a tarball containing that script.
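The attached script is Perl and uses HTML::Form for the wiki-posting half; the other half, spotting URLs in IRC messages, can be sketched in a few lines. Here's a rough Python version of just that piece (the regex is my own approximation, not what the attached script uses):

```python
import re

# Rough URL matcher -- the attached Perl script and HTML::Form handle
# the actual wiki posting; this only shows the URL-spotting half.
URL_RE = re.compile(r'https?://[^\s<>"]+')

def extract_urls(irc_message):
    """Return any URLs mentioned in a line of IRC chatter."""
    return URL_RE.findall(irc_message)
```

Anything the function returns would then get appended to the wiki page via its edit form.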
Some friends and I just threw a huge party with a Dystopian future theme. I wanted to have a hand scanner at the door because biometrics scare the hell out of me. I started out with grand plans involving laptops and real scanners and all sorts of things like that, but as time drew short I resorted to trickery.
We ended up with a stainless steel cylinder (trash can). Atop it was supposed to be a black glass sheet against which palms could be pressed, but I accidentally shattered that while working on it the night before the party. I ended up using some black foam core board with a white palm on it that looked okay.
When someone pressed their palm against it the 'accept' light glowed and a pleasant ding noise was heard. If, however, we were pressing the hidden, remote button they'd be denied. Denial just meant a different bulb and an unpleasant buzzer.
What's funny is I didn't use any electronics knowledge past what I learned reading The Gadget Book by Harvey Weiss in the second grade. Since then I've taken three years of electrical engineering, but none of it had half the impact of that silly little book.
I don't know if anyone took a picture of the finished scanner, but I snagged the schematics as penciled on my mimio white board.
I run a lot of mailing lists on Mailman (http://www.list.org/) servers. Almost all of these lists are configured so that only list subscribers are allowed to post, because the lists get a lot of spam that I don't want reaching subscribers.
Unfortunately, when a non-subscriber posts, they're not automatically rebuffed; instead I, as the mailing list administrator, get an email asking whether I want to approve their message anyway. If I don't answer that question I get a reminder every 24 hours. The reminders can be turned off, but some of Mailman's questions I do want brought to my attention (e.g. subscribed posters who have exceeded the maximum message size).
What I wanted was a way to configure a Mailman list so that non-subscribers get a message explaining why their post isn't being accepted, without my having to go click 'reject' on a web form. I started to add this feature to Mailman itself, but that wouldn't really help: I can't get SourceForge or my company to upgrade to the newest version of Mailman even if my feature gets accepted, and those are the lists of mine that get the most spam.
Instead, I wrote a filter that catches email indicating a non-subscriber user has posted to a list and automatically goes and clicks 'reject' on their message. I've got the auto-clicker coded up pretty carefully such that any pending requests that aren't non-subscriber posts won't get auto-rejected. Also, if there's any sort of error in the process the initial notification message is allowed through.
The whole thing fits into a nice tidy Perl script. It's invoked via procmail and requires the excellent LWP suite of modules available from CPAN. The script is attached.
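The careful part is deciding which notifications are safe to auto-reject. A minimal sketch of that triage step in Python (the attached script is Perl; the exact subject wording Mailman uses is my recollection and may differ by version):

```python
import re

# Mailman's held-post notices have subjects along the lines of
# "Listname post from user@example.com requires approval".  That exact
# wording is an assumption here; the real script parses the full
# notification body before driving the admin web form via LWP.
HELD_RE = re.compile(r'^(?P<list>\S+) post from (?P<sender>\S+@\S+) requires approval$')

def parse_held_notice(subject):
    """Return (listname, sender) if this looks like a non-subscriber
    held-post notice, else None.  Anything unrecognized is left for the
    human administrator, mirroring the script's fail-safe behavior."""
    m = HELD_RE.match(subject.strip())
    return (m.group('list'), m.group('sender')) if m else None
```

Only a positive match triggers the auto-click; a `None` means the original notification passes through untouched.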
I got permission from Derek Tonn of tonnhaus design to use his map on the site, and I got the new site fully set up at http://mpls-watched.org. With all that done I figured it was time to send out press releases, and fired them off to the Strib, City Pages, the Rake, and Skyway News. Who knows, maybe one of 'em will run something.
When I wasn't sure if I'd be able to use the tonnhaus map, I was trying to figure out ways to make my gathered location data still useful. As mentioned, I took some GPS points to test the theory that the map was to scale. I then marked those same four points on the tonnhaus map and calculated the X and Y pixel/degree ratios for each of the six (4 choose 2) pairs of points.
If the map were perfectly to scale, my GPS perfectly accurate, and my point selection on the map a perfect match for where I stood when taking the GPS readings, the horizontal and vertical pixel/degree ratios would be the same for all six pairs of points. Unfortunately, they were way off. I'd have written the map off as not to scale if it hadn't been for how very off the ratios were: not only were they of wildly different magnitudes, some of them even had different signs. That shouldn't be possible no matter how skewed the scale was.
I spent a good hour puzzling over how my calculations could be so far off when it hit me: Minneapolis isn't built on a true north/south/east/west grid. It's horribly skewed. I'd made the rookie mistake of assuming the top of the map is always north. I got out a protractor, made a quick measurement, rotated the tonnhaus design map 22 degrees, re-picked my points on the map, re-did my math, and found nice, reasonably similar ratios. After I threw out the shortest runs between points (short runs maximize error) I got the percent standard deviation for both the horizontal and the vertical down to small enough values that I think converting points on the digital map to latitude/longitude coordinates will be sufficiently precise to make my data portable. Whew.
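The calibration math itself is simple once the rotation is handled. A sketch in Python (all coordinates and the 22-degree figure used as given above; the real work was done by hand):

```python
import math

def rotate(x, y, degrees):
    """Rotate a map point about the origin, to undo the map's skew
    from true north (Minneapolis' street grid is off by ~22 degrees)."""
    r = math.radians(degrees)
    return (x * math.cos(r) - y * math.sin(r),
            x * math.sin(r) + y * math.cos(r))

def pixel_per_degree(p1, p2, g1, g2):
    """Pixel-distance to degree-distance ratios for one pair of
    calibration points: (px, py) map pixels vs (lon, lat) GPS fixes."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dlon, dlat = g2[0] - g1[0], g2[1] - g1[1]
    return dx / dlon, dy / dlat
```

If the map is to scale, `pixel_per_degree` should return roughly the same pair of ratios for every pair of rotated calibration points; wildly different magnitudes or signs mean something else (like an unrotated grid) is wrong.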
It took most of a weekend, but there's now a nice website for the Minneapolis Surveillance Camera Project at http://sarinity.com. I'll be moving it to its own domain eventually, but that'll take a week or so.
The look is entirely owed to the Open Source Web Design site, http://oswd.org. I love being able to just go snarf a well coded template for a new project. Those people are doing a real service.
I did the meat of the new site myself in Perl. One can now view camera locations, information, and pictures; report cameras; and upload photos of cameras.
I heard back from Derek Tonn of tonnhaus design about using the map, and he's understandably interested in seeing how the project comes out and what it's about before he provides the tacit approval implied by the use of his base map. If I need to switch to another map it shouldn't be a hassle; I just despair of finding one as pretty as his.
Update: I've shut down this site.
I got the surveillance camera location reporting stuff working tonight. It's amazing how easy Perl's CGI.pm makes stupid little web input forms. I'm sure I'll think of some other fields I want in the data before this thing goes live, but for now this should do nicely: https://ry4an.org/surveillance/report/
The map I'm using is nice, but doesn't include all of downtown, and I still haven't heard back from its creators about getting permission to use it. Since I might have to change maps in the future (or might want to expand project scope) I'm hoping to store the camera locations as GPS coordinates rather than as useless pixel locations.
Toward that end I walked to the four corners of the map tonight while taking GPS readings. I'll import the data later and see if the map is to scale in addition to being aesthetically pleasing. If it is, extracting latitude and longitude data will be a snap, given that the spherical nature of the earth can be ignored for areas as small as downtown Minneapolis. If it's not, I'll probably have to find a new map before too many cameras get entered.
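Assuming the map checks out, pixel-to-coordinate conversion is just linear interpolation between calibration points. A minimal sketch (calibration values invented for illustration):

```python
def pixel_to_latlon(px, py, cal_a, cal_b):
    """Linearly interpolate a pixel position to (lat, lon) using two
    calibration points, each a ((px, py), (lat, lon)) tuple.  Assumes
    the map is to scale and axis-aligned; earth curvature is ignored,
    which is fine at the scale of downtown Minneapolis."""
    (ax, ay), (alat, alon) = cal_a
    (bx, by), (blat, blon) = cal_b
    lon = alon + (px - ax) * (blon - alon) / (bx - ax)
    lat = alat + (py - ay) * (blat - alat) / (by - ay)
    return lat, lon
```

Storing cameras as the resulting GPS coordinates, rather than raw pixels, is what makes the data survive a map change.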
Next steps are:
None of those should be terribly hard. Who knows, within a month or so this might actually be a project worth looking at.
Update: I've shut down this site.
Target Corporation is donating a quarter million dollars to the city of Minneapolis, which the city council rapidly accepted, to install 30+ police-monitored security cameras. I'm not able to articulate why stuff like this scares me as much as it does, but I just get a queasy feeling when I think of government surveillance of the citizenry.
The ACLU has found that police cameras do not yield any significant reduction in crime, and there are many documented instances where police cameras have been used to illegally and inappropriately infringe on the privacy rights of citizens. That said, I think keeping camera counts down is a losing battle. Most people just can't get worked up about privacy rights in general and security cameras specifically.
The New York Surveillance Camera Project (http://www.mediaeater.com/cameras/overview.html) has produced a great map (http://www.mediaeater.com/cameras/maps/nyc.pdf) of all the thousands of Manhattan-area cameras they could find. I'm looking to do the same thing for Minneapolis. I guess the hope is that people will be startled when they see how frequently they're recorded, and will at least think twice the next time the government wants to put up more cameras. Who knows, maybe given enough time we can even set up a least-watched route finder like the people at iSee have (http://www.appliedautonomy.com/isee/).
For now all I've done is define an XML format for representing camera information (https://ry4an.org/surveillance/camera.xml and http://ry4an.org/surveillance/camera.dtd). The next step is to get a nice map of downtown Minneapolis (hopefully from tonnhaus design: http://www.tonnhaus.com/portfolio/city_area.htm) and create an image map and corresponding CGI form so friends and I can start entering locations. Lord only knows when I'll have time for that.
Update: I've shut down this site.
At Onion Networks our CVS repository has a lot of symlinks that need to exist within it for builds to work. Unfortunately, CVS doesn't support symbolic links. Both Subversion and MetaCVS support symbolic links, but neither is sufficiently ready for our needs, so we're stuck creating links manually in each new CVS checkout.
Sick of creating links by hand, I decided to write a quick shell script that creates a new shell script that recreates the symlinks in the current directory and below. A year or two ago I would have done this in Perl. I love Perl and think it gets an undeserved bad rap, but I find I'm doing little one-off scripts in straight shell (well, bash) lately, as others are more inclined to try them out that way.
Doing this in shell also gave me a chance to learn the getopt(1) command. Getopt is one of those things you know is always there if you need it but never get around to using. Its syntax sucks, but I'm not sure I could come up with better, and what they've got works. While writing my script I kept scaling back the number of options it was going to offer (absolute, relative, etc. all gone) until really I was down to just one argument and could've put off learning getopt for another few years. Oh well.
Once I'd written all the option parsing stuff and started with the find/for loop/readlink command that was going to print out the ln commands, I noticed that by using the find command's powerful -printf action I could turn my whole damn script into a single line. At least my extra wordy version has an integrated usage chart.
Here's the one line version:
find . -type l -printf 'ln -s %l %p\n'
Attached is my script that does pretty much the same thing.
Road Rage Races are an idea I came up with a few years back that I'm trying to resurrect. I've updated the website (https://ry4an.org/rrr), and tacked on a new tag line: "Light travels at 299,792,458 m/s. Immaturity travels at 5 mph."
In a Road Rage Race the competitors start out in a centrally located parking lot in the Twin Cities area. They then race to one of five previously agreed-upon destinations, selected randomly at the time of the race start. The hitch is that this is done during the height of the evening rush hour, keeping top speeds in the 10 to 20 mph range.
Particular fun could be had if multiple types of vehicles can be coaxed into participating. I'd love to see folks on bike vs. foot vs. car vs. bus vs. motorcycle. I tried to get one of these organized in 2001, but it's hard for everyone to get out of work early. Maybe a Friday or Saturday night in the busy downtown area would work as well.
What's nice now is that consumer grade GPS devices have come down in price significantly. Many of the people I'm trying to cajole into playing already have them. Their position tracking features will allow us to record where each car is at each second. After the race is over we'll be able to create a detailed replay with almost no effort and great accuracy.
Comments
I'd be up for it; it could be a lot of fun. However, don't we all need to get hopped-up little sports cars à la The Fast and the Furious?
haha, just kidding about that...
-- Louis Duhon
I'm trying not to have all the projects and ideas posted to this list be computer related, but I guess that's where I expend most of my creative energy. I bought a Mimio electronic white board (http://mimio.com) cheap on eBay ($40), and while the Windows software for it is reported to be quite good, the Linux software options ranged from vapor to unusable. I did, however, find some Perl code that handled protocol parsing (the tedious part), so I started with that.
The white board part of it was largely uninteresting, but one fun problem cropped up. The Mimio infrequently reported false pen position data that caused the resulting image to have some terrible lines across it. An example of the output with the bad data can be found in the attached unfiltered.png image.
To filter out the bad points I started by tossing out any line segments that were longer/straighter/faster than a human should reasonably be drawing. Essentially I was taking the first derivative of the position data and discarding anything with too high a speed.
Always one to go too far, I modified my filter so it could take Nth-order derivatives. I ended up configuring it to take the first four derivatives of position (velocity, acceleration, jerk, jounce[1]). I could've set it to 100 or so, but I figured with the time resolution I had I was already pushing it.
I experimentally arrived at threshold levels for the scalar magnitudes of each derivative using the ophthalmologist technique ("this or this?", "this or this?"). The end result for the same image can be viewed in filtered.png. The missing lines that should be there correspond to where I didn't press the pen down hard enough, and are actually missing in the unfiltered image as well. It's still not perfect, but it's good enough for me, unless someone else has a cool filtering idea I can try.
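The filter boils down to repeated finite differences with a magnitude cutoff at each order. A minimal Python sketch of the idea (the actual code is Perl, and these threshold values are made up):

```python
def derivatives(points, order):
    """Nth-order finite differences of a list of (x, y) pen samples,
    treating samples as evenly spaced in time."""
    d = list(points)
    for _ in range(order):
        d = [(b[0] - a[0], b[1] - a[1]) for a, b in zip(d, d[1:])]
    return d

def plausible(points, thresholds):
    """True if the pen trace stays under the magnitude threshold at
    every derivative order -- e.g. four thresholds cover velocity,
    acceleration, jerk, and jounce.  Threshold values are arbitrary
    here; the real ones were tuned by eye."""
    for order, limit in enumerate(thresholds, start=1):
        for dx, dy in derivatives(points, order):
            if (dx * dx + dy * dy) ** 0.5 > limit:
                return False
    return True
```

A glitchy sample shows up as a huge first derivative (and even huger higher ones), so the segment containing it gets discarded.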
I've attached the tarball for the software for posterity. If you're using it email me first -- I might have a more current unreleased version.
[1] http://math.ucr.edu/home/baez/physics/General/jerk.html
This work is licensed under a
Creative Commons Attribution-NonCommercial 3.0 Generic License.
©Ry4an Brase | Powered by: blohg 0.10.1+/77f7616f5e91