VT Campus Map - As part of a Multimedia Class, I worked
in a group of three with University Relations to develop a new Campus
Map for Virginia Tech. The need for a new campus map was well
established: with the previous version, any time a new building was
added to the campus (which happened quite frequently), many maps had to
be updated, since the new building changed the landscape of the local
area. This would often affect 20 or more maps. Unfortunately, the sheer
volume of work, along with the office's many other responsibilities,
led to map stagnation. My team and I were
able to develop a solution using PHP and Postgres that would dynamically
create maps. This meant that whenever a new building was erected, the
building coordinates would be entered along with identifying information
(name, abbreviation, etc.) and a single master map would be updated. All
other maps would now automatically include this new building without a
single change being made to them. The new map was well received in user
tests, and those who could compare the new version to the old one
greatly appreciated the added functionality. Disappointingly, however,
several complications prevented the map from going live after
completion. Source code available upon request. [Demo] [Powerpoint
Presentation on the map (836 KB)] [Final Report (MS Word) (2.47 MB)]
[Table of Contents and Abstract (MS Word)]
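The single-source-of-truth idea behind the dynamic map can be sketched as follows. The real system used PHP and Postgres; the function names, coordinate scheme, and in-memory "table" below are hypothetical stand-ins for illustration only:

```python
# Sketch of the dynamic-map idea: enter a building once, and every
# generated map includes it automatically. (The real project used PHP
# and Postgres; this in-memory list stands in for the database table.)

MASTER_BUILDINGS = []

def add_building(name, abbreviation, coordinates):
    """Enter a new building once, with its identifying information."""
    MASTER_BUILDINGS.append(
        {"name": name, "abbr": abbreviation, "coords": coordinates}
    )

def render_map(bounds):
    """Return the buildings falling inside one map's bounding box.

    bounds = (min_x, min_y, max_x, max_y). A real renderer would draw
    each footprint; listing the included buildings shows the point:
    no individual map has to be edited when a building is added.
    """
    min_x, min_y, max_x, max_y = bounds
    return [
        b["abbr"]
        for b in MASTER_BUILDINGS
        if all(min_x <= x <= max_x and min_y <= y <= max_y
               for x, y in b["coords"])
    ]
```

With this arrangement, calling `add_building` once makes the new building appear in both a campus-wide map and any zoomed-in map whose bounds contain it, without touching either map.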
GIS Programs - These programs were all created for
my Algorithms in GIS class in partnership with one other student:
Assorted Small Programs - These were mainly beginner
programs that did things like calculate Great Circle distance from
one geographic location to another and build header files for DEM
data sets.
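The great-circle calculation in those beginner programs can be sketched with the standard haversine formula; the function name and the spherical-Earth radius here are my choices, not the original class code:

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points (degrees),
    using the haversine formula on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))
```

For example, Blacksburg, VA (37.23, -80.41) to Washington, DC (38.90, -77.04) comes out to roughly 350 km.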
Raster Program - This was the program that I most
enjoyed developing. The user would specify a DEM file, and several
surface analyses that I programmed could then be run on it. One
could compute the slope of all locations on the DEM and graphically
represent it. Another analysis would compute what direction a particular
area was facing by evaluating the surrounding cells. The cells would
then be shaded, allowing the user to visualize the geographic orientation
by using a helpful color ramp. Both the aspect and slope data would
be combined (along with user input specifying the sun's location)
to create a hillshading map. The program could also do other impressive
things, like interpolate what a surface might look like when given
only a few data points for it. For example, a land buyer could use
this program to figure out what portion of the land would be best
suited for a house (in terms of shade, gradient, and desired sun
exposure). [Powerpoint presentation including screenshots (1.18 MB)]
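The three analyses described above can be sketched like this. The finite-difference scheme and function names are hypothetical stand-ins for what the class program did, but the hillshade formula (combining slope, aspect, and a user-supplied sun position) is the standard one:

```python
import math

def slope_aspect(dem, cellsize=1.0):
    """Per-cell slope (radians) and aspect (radians) from a DEM grid.

    dem is a list of lists of elevations. Interior cells use simple
    central differences; aspect is the downslope direction from
    atan2 (0 = east, counterclockwise). A hypothetical stand-in for
    the scheme the class program used.
    """
    rows, cols = len(dem), len(dem[0])
    slope = [[0.0] * cols for _ in range(rows)]
    aspect = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            dzdx = (dem[r][c + 1] - dem[r][c - 1]) / (2 * cellsize)
            dzdy = (dem[r + 1][c] - dem[r - 1][c]) / (2 * cellsize)
            slope[r][c] = math.atan(math.hypot(dzdx, dzdy))
            aspect[r][c] = math.atan2(-dzdy, -dzdx)  # downslope direction
    return slope, aspect

def hillshade(slope_rad, aspect_rad, sun_azimuth, sun_altitude):
    """Standard hillshading for one cell: brightness in [0, 1],
    given the sun's azimuth and altitude (radians)."""
    zenith = math.pi / 2 - sun_altitude
    value = (math.cos(zenith) * math.cos(slope_rad)
             + math.sin(zenith) * math.sin(slope_rad)
             * math.cos(sun_azimuth - aspect_rad))
    return max(0.0, value)
```

Mapping the slope or hillshade values through a color ramp gives the shaded visualizations described above.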
Voter Redistricting - This program is a sample
of what could be used in creating political districts. The user supplies
an ESRI shape file and a dBase file containing census information.
While the program is not intelligent enough to group tracts into
districts by itself, it does give the user feedback on how close a
district would be to the ideal majority/minority balance with the
selected tracts, and it allows the user to add or remove tracts on
their own.
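The feedback calculation can be sketched as follows; the tract data structure and target share below are hypothetical stand-ins for the census attributes read from the dBase file:

```python
def district_balance(tracts, selected, target_minority_share=0.5):
    """Report how close a proposed district is to a target minority share.

    tracts: {tract_id: (total_population, minority_population)} -- a
    hypothetical stand-in for the dBase census attributes.
    Returns (actual_share, deviation_from_target); the user adds or
    removes tracts and watches the deviation move toward zero.
    """
    total = sum(tracts[t][0] for t in selected)
    minority = sum(tracts[t][1] for t in selected)
    share = minority / total if total else 0.0
    return share, share - target_minority_share
```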
Names Machine - This program can be used to locate
cities in the US with a certain name or partial name. The user would
input a full city name ("Blacksburg") or a partial name (e.g., ends
with "burg"), and the program would search the database of US cities,
locate applicable matches, and highlight each match on the map. The
most interesting result of this program
was to see how some city name endings would be concentrated in different
areas of the country ("borough" vs "ville").
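The matching logic can be sketched in a few lines; the function name and the tuple layout of the city database are my illustration, not the original program:

```python
def find_cities(cities, name=None, ends_with=None):
    """Return cities matching an exact name and/or a name suffix.

    cities: iterable of (city_name, state) tuples -- a hypothetical
    stand-in for the US place-name database the program searched.
    Matching is case-insensitive, as a place-name search should be.
    """
    matches = []
    for city, state in cities:
        if name is not None and city.lower() != name.lower():
            continue
        if ends_with is not None and not city.lower().endswith(ends_with.lower()):
            continue
        matches.append((city, state))
    return matches  # the real program then highlighted each match on the map
```

Plotting the results of suffix queries like "burg" and "ville" is what reveals the regional clustering mentioned above.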
Scatter Plot Matrix Visualization - I worked in a
team of three to program a Java component that would plug into a
pre-existing Java visualization tool. The visualization took desired
fields from a database and represented them in an easily interpretable
interface. The screenshot shows how this can be used to compare various
census statistics to each other on a state-by-state basis (high school
diploma rate vs. per capita income, population vs. median rent, etc.).
[Screenshot]
Professional
Access to MS SQL Migration - One of the major applications
at the law office where I work was a rather large Microsoft Access
application that kept track of all our clients, potential conflicts,
timeslips, etc. This database was getting rather large (100 MB) and was
beyond the size that Access can comfortably manage. Since our database
designer had
come out with a front end that would interact with a SQL back end, we
decided it was time to upgrade. With the designer's assistance, we were
able to successfully transfer all database records into SQL without any
major complications. I was specifically in charge of setting up the SQL
server, configuring it, and assuring all information stored in it would
be included in our nightly backup.
$200,000 Technology Reconstruction - During May 2003
our office sustained major damage from an F4 tornado. The damage was so
severe that we were forced to relocate for what would be over nine
months. Due to the extent of the destruction, my supervisor and I had
to reconstruct our network nearly from scratch. Fortunately, we
retained all of our data despite reliability questions regarding the
damaged machines; even though we had good backup policies in place, we
never had to invoke them. This not only saved us time but also spared
us the anxiety that is inevitable when restoring mission-critical data.
I single-handedly spent
well in excess of $100,000 on replacement hardware (from servers to switches)
and software. My supervisor spent an equivalent amount. Once all the purchases
came in, we were able to restore full services within an impressively
short time span. For example, a rudimentary wireless network was up within
days of relocation, allowing file and printer sharing. [Picture of me in server room after tornado] [Picture of wiring closet after tornado]
Access Database Replication - Before MS SQL became
a viable option for WAN database use, I had to find a way for
outer-office users to access our database in a user-friendly,
cost-effective manner. After quite a bit of research on data
replication, I was able to implement it successfully on our database so
that everyone would have a current copy of the data. Thereafter, all
that was required was a 30-minute daily call to keep data synchronized
across all locations, a vast improvement over the previous solution,
which involved calls in excess of an hour from each office every day.
Replication also gave the outer offices access to the data at any time,
a previously unavailable option. I was hired as a consultant to deploy
this solution
for another state's office as well.
Network Operating System Migration - Prior to the summer
of 1998, the network was running LANtastic over coax cabling. The
combination of unreliable cabling and a non-enterprise-level network
operating system was causing near-daily downtime. After researching
cabling options, I recommended
a 100Base-T implementation that was later installed. I also recommended
a switch to Windows NT 4.0. My supervisor and I were able to create all
user accounts/groups, implement a domain structure, and logically organize
all shared information in a weekend. This resulted in a nearly user-transparent
upgrade. The only major change noticed was a drastic uptime improvement.
Since then we have upgraded the servers to Windows 2000 and subsequently
Windows 2003. The Exchange server is also something I have kept current.
PHP/mySQL Application Development/Deployment - Our
website contains a large area of publications dealing with many issues
-- from fair housing to victims outreach to domestic violence assistance.
With well over 50 documents available and more being added, it was becoming
a bit too much for our webmaster to handle. I was able to create a mySQL
database that held information about all the publications, as well as
a PHP front end that would enable the administrator to simply type a name
and description for a file along with the location of the file on their
hard drive. The application would then upload the file to the web server
and enter the information in the appropriate area. In turn, the new
file would immediately be available to any end user who pulled up the
Publications page. The descriptions would also be searchable, since
they were stored in a database. This application is a great help to
both users and administrators. [Administrator Manual (MS Word)]
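The upload-and-publish workflow can be sketched as follows. The real application used PHP and mySQL; the sqlite3 database, table layout, and function names below are stand-ins chosen for a self-contained illustration:

```python
import pathlib
import shutil
import sqlite3

def init_db(conn):
    """Create the (hypothetical) publications metadata table."""
    conn.execute("CREATE TABLE IF NOT EXISTS publications "
                 "(name TEXT, description TEXT, path TEXT)")

def add_publication(conn, upload_dir, name, description, local_file):
    """Copy the administrator's file to the web server's publication
    area and record its name, description, and stored location."""
    dest = pathlib.Path(upload_dir) / pathlib.Path(local_file).name
    shutil.copy(local_file, dest)
    conn.execute("INSERT INTO publications VALUES (?, ?, ?)",
                 (name, description, str(dest)))

def search_publications(conn, term):
    """Match a search term against the stored descriptions -- this is
    what makes the Publications page searchable."""
    cur = conn.execute(
        "SELECT name, path FROM publications WHERE description LIKE ?",
        (f"%{term}%",))
    return cur.fetchall()
```

Once `add_publication` runs, the document is both on the server and in the index, so the Publications page reflects it with no webmaster involvement.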