Friday, September 30, 2011

Creating .NET Web Services for Firefox, IE, Chrome and Safari

Loren on the Art of MATLAB
September 30, 2011 8:28 AM
by Loren


Guest blogger Peter Webb returns with another in an occasional series of postings about application deployment.


Calling .NET Web Services from JavaScript

Our story so far: I've deployed a MATLAB-based .NET web service using Builder NE and the Windows Communication Foundation (WCF), and I've written a .NET client program that lets my users access that service. What happens when someone using a Mac or a Linux box asks for access? They can't run the Windows client -- but if they've got a web browser, I can provide them a client built with HTML and a quirky little language called JavaScript.

I'll demonstrate how to build a platform-independent web service client by extending a WCF-enabled deployed MATLAB component to support calls from JavaScript. My application has three parts:

  1. A deployed MATLAB component implementing a WCF-enabled type safe API.
  2. A WCF server publishing the type safe API via a service contract.
  3. A JavaScript client that requests service operations from the server.

The server requires WCF and must run under Windows, but the client can run in any browser that supports JavaScript and HTML 5 (and nowadays that's most of them -- except, ironically, IE).

Two words of warning here: first, this post builds on the others in the type safe API series, so you'd be well-served to read them first (at the very least, read the initial WCF post); and second, the complexity of client-server applications makes it impossible to fully describe how they work in this (relatively) short article. I've included links for most of the jargon, I hope -- click on them if there's something you don't understand. That being said, I do encourage you to keep reading. You can likely download and build the application without understanding all the theory behind it.

Exchanging Data with JavaScript Object Notation

For the client and server to successfully communicate, they must agree on a data format. The de facto standard data format for JavaScript clients, JavaScript Object Notation (JSON), represents all types of data with strings of JavaScript code, sacrificing storage efficiency for readability and ease of use. Essentially, the server sends little program fragments to the client, which executes them to recreate the data objects. Since the client and server trust each other implicitly in this case, transmitting executable code creates no security risks.
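
For example, the FractalOutline structure defined below might travel over the wire as a string like this (the values here are invented purely for illustration):

{"vectors":[[3,0],[1,2],[1,-2],[3,0]],"bbox":[0,0,300,300]}

Wrapping the text in parentheses before passing it to eval, as shown later, makes the outer braces parse as an object literal rather than a code block.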

In a previous post, I developed a WCF service that publishes an IFractal ServiceContract with a single operation, snowflake. Here, I extend IFractal to support JSON output by making two changes. First, I add the [WebGet] attribute, specifying the output format WebMessageFormat.Json:

[ServiceContract]
public interface IFractal
{
    [OperationContract(Name = "snowflake")]
    [WebGet(ResponseFormat = WebMessageFormat.Json,
            UriTemplate = "/?n={n}&width={width}&height={height}")]
    FractalOutline snowflake(int n, int width, int height);
}

UriTemplate defines a pattern determining how to map named web request parameters to snowflake's inputs.

The original snowflake returns two values: the first in the function's return value and the second via a C# out parameter. But my client invokes the service through JavaScript's XMLHttpRequest API, which only permits a single return value. Hence, the second change: returning the two pieces of data in a single structure. To enable WCF to automatically marshal the structure, I decorate the return type, FractalOutline, with the [DataContract] attribute, and each of its fields with [DataMember].

[DataContract]
public struct FractalOutline
{
    [DataMember]
    public int[][] vectors;

    [DataMember]
    public int[] bbox;
}

With these changes, the service will send a block of JSON code in response to HTTP requests.

Hosting the Service

I host the KochJSON service in a Windows console application, configuring its HttpBinding endpoint to publish the IFractal contract using the webHttpBinding protocol. A webHttpBinding endpoint listens for plain HTTP requests rather than the more complex SOAP messages, greatly simplifying the client code. I add the new endpoint to the application's configuration file, App.config:

<endpoint
    address=""
    binding="webHttpBinding"
    bindingConfiguration="webHttpConfig"
    contract="IFractal"
    name="HttpBinding" />

Clients making an XMLHttpRequest require a webHttpBinding endpoint.

A JavaScript Client

My client performs two tasks: requesting the outline of the Koch snowflake from the KochJSON web service and then drawing the snowflake in the browser window. IFractal's [WebGet] attribute defines the format of the URL serving the snowflake data. To retrieve the 4th iteration of the Koch snowflake, scaled to fit within a 300x300 rectangle, make the following request:

http://localhost:42017/KochJSON/?n=4&width=300&height=300

I've configured the service host to listen for client requests on port 42017 on my local machine. The string of parameters following the service name matches the pattern specified by the UriTemplate I defined in the [WebGet] attribute of the IFractal [ServiceContract]. The parameter n here maps to snowflake's input n, and so on, so the service calls snowflake(4, 300, 300).

Making this request in JavaScript requires an XMLHttpRequest object, which you create with new:

var request = new XMLHttpRequest();

Call open to initialize the request with the HTTP method (GET), the HTTP address of the service, and true to specify that the request should be made asynchronously. Then call send to make the request.

request.open("GET",    "http://localhost:42017/KochJSON/?n=4&width=300&height=300",    true) request.send(null);

The XMLHttpRequest object notifies me when the asynchronous request completes by invoking a callback function I provide. I convert the response data from its JSON text notation into a live JavaScript object by invoking eval:

var jsonOutline = eval( '(' + request.responseText + ')' );

My [DataContract] structure, FractalOutline, contains two fields, vectors and bbox. Since JSON data marshalling preserves field names, I retrieve the data from jsonOutline by referencing its vectors and bbox fields.

var outline = jsonOutline.vectors;
var bbox = jsonOutline.bbox;

vectors and bbox are native JavaScript arrays, so I manipulate them using native JavaScript syntax. I draw the outline with a for-loop, at each step calling the HTML 5 canvas function lineTo:

x = x + outline[i][0];
y = y + outline[i][1];
context.lineTo(x, y);

There's a bit more code in the callback to manage errors and to ensure that drawing doesn't start until the entire outline is available, but it amounts to no more than ten lines or so: one line for data marshalling (the eval statement) and a few lines to make and manage the request. The bulk of the code I had to write focuses on solving the problem at hand: drawing the outline.
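
Assembled, the glue looks roughly like this. This is a minimal sketch, not the code from the download: the canvas element id, the draw helper, and the use of bbox to pick a starting point are illustrative assumptions, and in practice the handler is assigned before calling send.

request.onreadystatechange = function () {
    // readyState 4: response complete; status 200: HTTP success.
    if (request.readyState !== 4 || request.status !== 200) {
        return;
    }
    var jsonOutline = eval('(' + request.responseText + ')');
    draw(jsonOutline.vectors, jsonOutline.bbox);
};

function draw(outline, bbox) {
    var canvas = document.getElementById("snowflake");  // assumed element id
    var context = canvas.getContext("2d");
    var x = bbox[0], y = bbox[1];  // assumption: start point taken from bbox
    context.beginPath();
    context.moveTo(x, y);
    for (var i = 0; i < outline.length; i++) {
        x = x + outline[i][0];  // each entry stores a relative offset
        y = y + outline[i][1];
        context.lineTo(x, y);
    }
    context.stroke();
}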

Building and Running the Example

Download the code from MATLAB Central.

The download contains the MATLAB function snowflake.m, a Visual Studio 2008 solution KochJSON.sln and an HTML page, KochSnowflake.html, defining the JavaScript client. Create the server program by following these three steps:

  1. Build the IFractal interface DLL.
  2. Create the Snowflake .NET assembly and the KochIFractal type safe interface.
  3. Compile the server program, KochJSON, after referencing IFractal and KochIFractal in the KochJSON project.

The client does not require compilation.

The file ReadmeWCFJSON.txt contains detailed instructions.

To run the example, open a DOS window for the server. Make sure the DOS window's runtime environment supports the execution of deployed MATLAB .NET components (you'll need the MCR on your PATH), then run KochJSON\KochJSON\bin\Debug\KochJSON.exe. When the server is ready, it prints a short message:

Koch JSON Snowflake Service started. Press any key to terminate service...

Activate the client by double-clicking on KochSnowflake.html. The first client to contact the server causes the server to load the MCR, which takes some time; subsequent requests process very rapidly. The server supports multiple clients -- try connecting from different architectures, using different browsers. HTML 5-capable browsers should display the Koch snowflake outline.

Doing the Right Kind of Work

Retrieving the data and drawing the outline are domain tasks, central to the work I want to get done. Anything else (bookkeeping, data marshalling, managing the client-server connection) is a distraction, an artifact of the technologies I've chosen to implement my solution. Ideally, I'd like those technologies to manage themselves -- I'd like to concentrate on writing the code that solves the problem. And that's what deployed type safe APIs let me do: with a native C# interface for my MATLAB functions, I can take advantage of .NET automation and application management tools that were previously inaccessible.

This JavaScript client required more effort than a WCF client generated by Visual Studio, but it is lighter weight and much more widely usable. Does that matter to you? Will you write cross-platform clients like this one? How important is standards compliance to your applications? Let us know what you think.



Published with MATLAB® 7.13

Deployment


Dr. Art Trembanis
Associate Professor
CSHEL
109 Penny Hall
Department of Geological Sciences
The College of Earth, Ocean, and Environment
University of Delaware
Newark DE 19716
302-831-2498

"Education is not the filling of a pot, but the lighting of a fire." -W. B. Yeats

"Between the idea
And the reality
Between the motion
And the act
Falls the Shadow"
- T. S. Eliot

Seadiscovery.com - BlueView to Develop the Full Ocean Depth Multibeam Sonar


http://www.seadiscovery.com/mt/mtStories.aspx?ShowStrory=1050882112

Search Stories: Surfing for the Perfect Wave

Surf's up! What a neat application of Google Earth. Of course it helps to have a more dynamic view of the surf conditions; otherwise you might be chasing some rare event captured in the images. I've always been impressed with the scientific resourcefulness of surfers over the years.

http://feedproxy.google.com/~r/blogspot/SbSV/~3/tvRRlys8ILs/search-stories-surfing-for-perfect-wave.html

Tips to make the Google Earth Flight Simulator easier to use

Got to remember to try this down in the lab sometime.

http://feedproxy.google.com/~r/GoogleEarthBlog/~3/4CHtn8I-5YQ/tips_to_make_the_google_earth_fligh.html

Thursday, September 29, 2011

Using RSS to Monitor Data Transfers.

OceanDataRat.org
September 27, 2011 12:38 PM
by admin


I got this idea from a colleague down at Stennis Space Center about a year ago.  He said, "Wouldn't it be nice if we could know when data arrives on the server the same way we get notified about online news articles?"  The light bulb went on and pretty much exploded.  And why try to replicate the functionality?  Just use the same technology to publish data transfers to the web.  The technology I'm referring to is Really Simple Syndication (RSS), a dirt-bag simple way to publish information that allows anyone to subscribe to receive news updates on all sorts of platforms (browsers, news readers, email clients).

What is RSS?

The last line of the previous paragraph pretty much sums up what RSS does.  How it works is, as the name implies, really simple.  RSS is just an XML-based text file hosted on a web server.  The file must adhere to the standardized RSS XML schema, but because the schema is standardized, all kinds of programs have been written to interpret and display RSS articles.

Here's the basic layout of an RSS file:

<?xml version="1.0" encoding="ISO-8859-1"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<atom:link href="http://tethys.gso.uri.edu/data/rss.xml" rel="self"
 type="application/rss+xml" />
<title>EX Data Transfer RSS Feed</title>
<link>http://tethys.gso.uri.edu/data/rss.xml</link>
<description>This RSS feed provides updates to when files are synced to the
 Shore-based Redistribution Server</description>
<language>en-us</language>
<copyright>Copyright (C) 2011 Okeanos Explorer Program</copyright>
<item>
<title>EX1106 Data Upload Update - Tue, 27 Sep 2011 13:30:57 UTC</title>
<description>
<![CDATA[
<p>Added new file: EX1106/CTD/XBT/EX1106_XBT94_110927.EDF</p>
<p>   13 files updated in ./EX1106/SCSData/NAV</p>
]]>
</description>
<pubDate>Tue, 27 Sep 2011 13:30:57 UTC</pubDate>
</item>
</channel>
</rss>

The breakdown:

  • Line 1 is required, exactly as it appears.
  • Line 2 is the main container; everything within the <rss> container is interpreted as part of the RSS feed.
  • Line 3 is the container for an RSS channel; think TV channels.  I believe there can be multiple channels in a single RSS feed, but I'm not sure how RSS clients interpret multiple channels.  For this article I don't use multiple channels.
  • Lines 4-5 add compatibility with ATOM clients.  ATOM is an alternative syndication protocol.
  • Line 6 is the title of the RSS feed.
  • Line 7 is the URL of the feed (or it can be used to link to a parent site).
  • Lines 8-9 are the description of the feed, i.e. what the RSS feed is propagating.
  • Line 10 is the language of the feed.
  • Line 11 is the copyright info.
  • Line 12 is the opening tag for an item (article).
  • Line 13 is the title of the item (article).
  • Line 14 is the opening tag for the meat of the article.
  • Lines 15-18 are the meat of the article.  This RSS feed uses HTML-style tags for formatting the text.  This is not required, but for the RSS feed I set up it made things easier.  To use HTML-style formatting, add "<![CDATA[" at the beginning and "]]>" at the end of the text block.
  • Line 19 is the closing tag for the description.
  • Line 20 is the publishing date/time.  The date/time must be formatted just as it is shown to adhere to the RSS schema standard.
  • Line 21 is the closing tag for the item (article).  At this point additional items can be added.
  • Lines 22-23 are the closing tags for the channel and the RSS feed.

Now back to the original problem…

The Okeanos Explorer transfers all collected data (sans raw multibeam data and high-definition video) to shore via satellite every hour.  The collection, cataloging, checksum generation and upload all happen auto-magically.

The participants on shore depend on this data flow to stay in the know about the ship's findings and to actively participate in the exploration.  This data dependency created one of the most-asked questions… "is it (data) there yet?"

Enter RSS.  By creating an RSS feed based on the successful transfer of data from the ship to shore, we can let shore-side participants know almost instantly when new data has arrived.

How I did it:

For each hourly transfer to shore (via rsync) there is a corresponding log file.  The log file is created by rsync using the "-i" flag, which produces a list of all the files in the source directory and how each file was handled (i.e. as a new file, an updated file or unchanged).  I include the cruise ID and the date/time of the transfer in the log file name (e.g. EX1104_Transfer_to_Shore_20110810T093000Z).  This is used by my script to populate the <title> and <pubDate> fields in the RSS article.
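
The hourly call might look something like this sketch (the paths, host name and variable names are my own assumptions for illustration, not the ship's actual script):

#!/bin/bash
# Illustrative sketch only -- paths, host and names are assumptions.
CRUISE_ID="EX1104"
STAMP=$(date -u +%Y%m%dT%H%M%SZ)
LOG="${CRUISE_ID}_Transfer_to_Shore_${STAMP}"

# -a preserves file attributes, -z compresses over the satellite link, and
# -i itemizes how each file was handled (new, updated, unchanged) into the log.
rsync -azi "/data/${CRUISE_ID}/" "shore-server:/data/${CRUISE_ID}/" > "${LOG}"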

After a successful data transfer I upload the corresponding rsync log files to a specific directory on the shore-based server.  Once a file arrives on the shore-side server, a bash script processes the log file into an RSS article (<item></item>) and adds the article to the beginning of the RSS feed, and presto: within seconds of the data arriving, the users are made aware.  After the log file is processed it is moved to a backup directory so that it is not processed again.

In order to minimize the length of each article I only show what's new and what has changed.  For new files I list the file name and full path. For updated files I list the directory name and number of files that were updated.
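
A stripped-down sketch of that processing step follows (the real script is linked below under "The Code"; the file names here are assumptions, and the exact rsync itemize codes vary by version):

#!/bin/bash
# Illustrative sketch -- the production script is in "The Code" below.
LOG="$1"                                         # an uploaded rsync log file
PUBDATE=$(date -u '+%a, %d %b %Y %H:%M:%S UTC')  # RSS-compliant date format

{
  echo "<item>"
  echo "<title>Data Upload Update - ${PUBDATE}</title>"
  echo "<description>"
  echo "<![CDATA["
  # Lines beginning with '>f+' mark files rsync created on the destination
  # (the exact itemize pattern depends on the rsync version).
  grep '^>f+' "${LOG}" | awk '{print "<p>Added new file: " $2 "</p>"}'
  echo "]]>"
  echo "</description>"
  echo "<pubDate>${PUBDATE}</pubDate>"
  echo "</item>"
} > newest_item.xml
# The new <item> block is then spliced into the feed ahead of older items.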

Once the RSS file is created I save it on the Okeanos Explorer's shore-side web server for the shore-side team to see.  Take a look.

Caveats

Satellite communications at sea can be flaky at times due to faulty equipment, tracking issues and weather.  This causes the data transfers to periodically fail.  As part of the Okeanos Explorer's hourly data transfer scripts, the rsync command is called repeatedly until the entire transfer completes or five attempts have been made, whichever comes first.  Each rsync call produces a new rsync log file.  At the end of a successful transfer, all of the logs are sent to shore.  To account for this I wrote a script that batch processes any and all rsync log files within a directory.
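
The retry logic can be as simple as this sketch (again, the paths and names are illustrative assumptions):

#!/bin/bash
# Illustrative retry wrapper -- names and paths are assumptions.
MAX_TRIES=5
try=1
until rsync -azi "/data/EX1106/" "shore-server:/data/EX1106/" > "rsync_${try}.log"
do
    if [ "${try}" -ge "${MAX_TRIES}" ]; then
        echo "Transfer still incomplete after ${MAX_TRIES} attempts" >&2
        break
    fi
    try=$((try + 1))
done
# After a successful run, every rsync_*.log is uploaded to shore, which is
# why the shore-side feed script batch processes a whole directory of logs.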

The Code

Here is the bash script I use to process the rsync log files: download.

Here is the root RSS file that the articles are added to: download.  I used this file only the first time I ran the script.  To add articles to an existing RSS feed you need to run the script against the most recent version.

Here's the script I used to batch process a directory of rsync log files: download.

Both of the scripts are heavily documented so if you have any issues running them please take a look at the comments.

I hope this helps.

Want to talk about this some more? Please post your questions in the Forums.


Data Management bash rss rsync script



Wednesday, September 28, 2011

The giant wheel-shaped structures in the Middle East

Google Earth Blog
September 27, 2011 8:59 AM
by Google Earth Blog


First discovered in 1927 by British Royal Air Force fliers, the strange wheel-shaped structures in the Middle East are gaining new attention thanks to Google Earth. Researchers have discovered thousands of them in Jordan, Syria, Saudi Arabia and other nearby countries.


Some believe that the structures were used to contain animals, but there is no consensus about that. According to an article on CBSNews.com:

In Saudi Arabia, (David) Kennedy's team has found wheel styles that are quite different: Some are rectangular and are not wheels at all; others are circular but contain two spokes forming a bar often aligned in the same direction that the sun rises and sets in the Middle East.

The ones in Jordan and Syria, on the other hand, have numerous spokes and do not seem to be aligned with any astronomical phenomena. "On looking at large numbers of these, over a number of years, I wasn't struck by any pattern in the way in which the spokes were laid out," Kennedy said.

The function of the wheels may also have been similar to the enigmatic drawings in the Nazca desert.

"If we consider, more generally, the stone circles as worship places of ancestors, or places for rituals connected with astronomical events or with seasons, they could have the same function of [the] geoglyphs of South America, the Nazca Lines for instance. The design is different, but the function could be the same," she wrote in her email.

Kennedy said that for now the meaning of the wheels remains a mystery. "The question is what was the purpose?"

Despite all of the talk of the discoveries of these in Google Earth, very few articles actually provide KML files or coordinates to view them yourself. I've tracked down a handful of sites in northern Jordan, which you can view with this KML file. If you find more, please leave a comment and let us know.

Beyond that, what do you think the purpose of these wheels was? Practical, religious, astronomical, or something altogether different?

Science



Dead Sea not so dead, divers discover - Technology & science - Science - OurAmazingPlanet - msnbc.com


http://www.msnbc.msn.com/id/44694754/ns/technology_and_science-science/#.ToN77uuG6UY

Sunday, September 25, 2011

A Few Million Virtual Monkeys Randomly Recreate Shakespeare

Slashdot
September 25, 2011 11:19 PM
by samzenpus


First time accepted submitter allege6a writes "On September 23 at 2:30 PST the A Million Amazonian Monkeys project successfully recreated A Lover's Complaint. This is the first time a work of Shakespeare has actually been randomly reproduced. It is one small step for a monkey, one giant leap for virtual primates everywhere. From the article: 'For this project, I used Hadoop, Amazon EC2, and Ubuntu Linux. Since I don't have real monkeys, I have to create fake Amazonian Map Monkeys. The Map Monkeys create random data in ASCII between a and z. It uses Sean Luke's Mersenne Twister to make sure I have fast, random, well behaved monkeys. Once the monkey's output is mapped, it is passed to the reducer which runs the characters through a Bloom Field membership test. If the monkey output passes the membership test, the Shakespearean works are checked using a string comparison. If that passes, a genius monkey has written 9 characters of Shakespeare. The source material is all of Shakespeare's works as taken from Project Gutenberg.'"

Read more of this story at Slashdot.


idle



Thursday, September 22, 2011

Elevation Profile - Google Earth Help

Tip sheet on how to create and view an elevation profile in GE 6.  Note: be sure to set elevation to "relative to seafloor."  If you select "relative to ground," it will be flat over the water... learned that the hard way.

http://earth.google.com/support/bin/answer.py?answer=181393

Wednesday, September 21, 2011

Consolidated Web-based management of backup scripts running on remote computers.

A very useful article from Webb Pinner's blog about setting up and managing a diverse at-sea data management system.

OceanDataRat.org
September 19, 2011 2:55 PM
by admin


Here's a technique I developed two years ago to streamline the process of consolidating datasets collected on multiple computers.  Using some batch scripts, PHP code and some open source tools, I created a simple web-based management system for controlling my regular data backup tasks.

Problem: The data collection landscape on the Okeanos Explorer is not the simplest.  There are separate workstations for each of the major collection systems (CTD/XBT, SCS, multibeam, EK60, etc.).  At the beginning of each cruise the ship's survey techs create new folders on each of the collection workstations for that system's datatype (i.e. SBE SeaSave, SIS, SCS).  For size reasons (e.g. multibeam data) some of the collection workstations store their collected data on a share drive (i.e. a NetApps storage array); in these cases the folder is created on the shared drive.  This is not done as standard practice, though, in order to prevent dependencies on network resources.  Typically data remains on the collection workstation to improve the performance of additional product development (e.g. creating maps, calculating SVPs, plotting SST) or for data comparison.

The ship needed a unified data management solution that met the following requirements:

  1. All the collected data needed to be consolidated consistently, on a cruise-by-cruise basis, to a single point.
  2. The solution had to be flexible enough to accommodate an ever-changing list of collection points.
  3. It needed to be simple enough that it could be managed with a minimum amount of the survey tech's time.
  4. It needed to be platform independent.
  5. It needed to enforce data management policies (i.e. naming conventions) where possible.

Solution: The first thing that was needed was a consolidated collection point, something the Okeanos Explorer calls the shipboard data warehouse (Warehouse).  The Warehouse has enough storage for all datasets (sans raw high-definition (HD) video and raw multibeam data) for an entire field season (~800GB).  The hardware is reasonably fault-tolerant: a Dell PowerEdge 2950 rack-mount server with dual NICs, dual power supplies, and 8 hot-swappable 150GB SAS drives connected to a hardware RAID controller.  The server runs Debian 6 (Linux).

We use rsync batch scripts called as scheduled tasks (Linux/Mac machines use bash scripts and cron jobs) to transfer the data from the collection computer to the Warehouse (refer to Using RSYNC to Efficiently Backup Data).  The rsync jobs run every hour.  We use rsync's --include/--exclude arguments to enforce naming conventions, as sketched below.  The rsync jobs are tailored such that the data from each collection point is copied to a standardized directory location on the Warehouse regardless of the original directory name.  A new directory structure is created on the Warehouse for each cruise that contains the cruise ID (e.g. EX1104).  Error checks are performed in the scripts wherever possible.  Any errors, as well as successes, are reported to the collection workstation as Growl notifications (refer to Using GrowlNotify to Send System-wide Notifications From Scripts).
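
For example, a naming-convention filter might look like the following sketch (the patterns, paths and destination host are assumptions, not the ship's actual scripts):

#!/bin/bash
# Illustrative sketch -- patterns and paths are assumed.
CRUISE_ID="EX1104"

# Descend into every directory, copy only files that start with the cruise
# ID (the naming convention), and skip everything else.
rsync -az \
      --include="*/" \
      --include="${CRUISE_ID}*" \
      --exclude="*" \
      "/collected/" "warehouse:/data/${CRUISE_ID}/SCS/"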

Having all the data in one place was extremely useful to ship's crew and science alike.  This prompted us to make the consolidated datasets publicly available (read-only) via FTP, SMB and HTTP.  To quell security concerns we moved the Warehouse to the visitor's network and altered the backup scripts to use SSH tunneling for the transfers (refer to Setting Up SSH Public Key Authentication).

All of the backup scripts behave based on a centralized configuration file.  The configuration file contains all of the local and remote (on the Warehouse) directory names.  The configuration file lives on the Warehouse and is accessed by the collection workstations via HTTP using the wget utility (wget for Windows).  Once the file is downloaded, the variables are loaded into the shell environment, the local copy of the file is immediately deleted (for security), and the script does its job.  When the script completes (success or fail) the variables are erased (again for security).
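
In outline, each backup script's config handling might look like this sketch (the URL, file names and variable names are assumptions):

#!/bin/bash
# Illustrative sketch -- URL and variable names are assumptions.
CONFIG_URL="http://warehouse/config/backup_config.sh"
LOCAL_COPY="/tmp/backup_config.sh"

wget -q -O "${LOCAL_COPY}" "${CONFIG_URL}" || exit 1
source "${LOCAL_COPY}"   # load the shared variables into this shell
rm -f "${LOCAL_COPY}"    # delete the local copy immediately (security)

# ... the rsync transfer runs here, driven by the downloaded variables ...

unset CRUISE_ID REMOTE_DIR LOCAL_DIR   # erase the variables when done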

Dataflow diagram showing the flow of configuration files and data.

Management of the configuration file is web-based.  A secure website (via .htaccess) running on the Warehouse contains a web form for altering the master configuration file.  At the beginning of each cruise the survey tech updates the directory information as required and hits a "save" button.  The next time the scripts run, the new variables will be applied.


Web-based configuration file control.


Additional bells and whistles: While in port, the ship's systems are turned off.  The scheduled tasks used for the automatic backups, however, are not.  There are two ways to handle this.

  1. Go around to each of the collection workstations and disable all the scheduled tasks
  2. Use the central configuration file to disable the backup scripts.

We went with the latter approach.  A variable in the central configuration file serves as a master switch that prevents the script from reaching the rsync command.

The configuration file also contains the cruise ID, which is used as part of the ship's naming convention and is the name of the top-level cruise directory.
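
Together, the two pieces might look like this sketch (the variable names are assumptions):

# Illustrative master configuration file (variable names assumed):
BACKUPS_ENABLED="NO"   # master switch, set to NO while in port
CRUISE_ID="EX1104"     # top-level cruise directory and naming convention

# Guard at the top of each backup script, after sourcing the config:
if [ "${BACKUPS_ENABLED}" != "YES" ]; then
    exit 0             # in port: never reach the rsync command
fi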

The Code:

Here's the code for the website, big thanks to friend and honorary datarat Eric Martin: download.  Unzip this into the document root folder for your website (e.g. /var/www).  You will need to open index.php and set the $batFile variable for your particular installation.

Here's a sample backup script that uses the method described in this article: download.  You will need to change the HOMEPATH, CONFIG_FILENAME and CONFIG_URL variables for your particular installation.

I hope this helps.

Want to talk about this some more? Please post your questions in the Forums.


Datalogging backup batch Growl Linux rsync script ssh web wget Windows



Simple3D -- 3D Scanners, Digitizers, and Software for 3D Models and Measurements

A lot of links here to laser scanner systems.

http://www.simple3d.com/



Pico Projector + Light Fixture + Free Code = Desktop Spherical Display

Science on a snow globe!

MAKE
September 19, 2011 9:00 AM
by Sean Michael Ragan


Sometimes, I get this feeling like I've seen it all—that nothing that comes along is ever going to inspire or delight me the same way that certain ideas, systems, inventions, and/or artworks did when I was younger. It always passes, sooner or later, but while I'm under that spell it can be…well, it can be a bit depressing, honestly. So I feel like I ought to thank International Man of Mystery Nirav Patel, somewhat more personally than usual, for making and sharing this wonderful thing. I am inspired.

He calls it Science on a Snow Globe, and it was, itself, inspired by NOAA's Science on a Sphere project. Whereas the Science on a Sphere globe displays are 8′ across, use four projectors and five computers apiece, and cost thousands of dollars individually, Nirav's system sits on a desktop, projects onto an 8″ frosted glass lamp globe, uses a single laser picoprojector and a single computer, and costs about $200. Nirav writes:

The basic design here is to shoot a picoprojector through a 180° fisheye lens into a frosted glass globe. The projector is a SHOWWX since I already have one, but it likely works better than any of the non-laser alternatives since you avoid having to deal with keeping the surface of the sphere focused. Microvision also publishes some useful specs, and if you ask nicely, they'll email you a .STL model of their projector. The lens is an Opteka fisheye designed to be attached to handheld camcorders. It is by far the cheapest 180° lens I could find with a large enough opening to project through. The globe, as in my last dome based project is for use on lighting fixtures. This time I bought one from the local hardware store for $6 instead of taking the one in my bathroom.

Nirav printed a custom bracket that holds projector, lens, and globe together in the right arrangement, and mounted the whole thing on a small off-the-shelf tripod. Lots of nummy technical details are available here, and the code, which Nirav wrote himself, is at Github. [Thanks, Matt Mets!]


Electronics Imaging Science display masterpieces projection sphere



Google Earth 6.1 now available: New features make it easier than ever to explore your world

Can't wait to try this out, especially the elevation tool.

Google LatLong
September 20, 2011 4:30 PM
by Lat



Today we are pleased to announce new features available in Google Earth. The Google Earth 6.1 update includes enhancements to make Google Earth easier than ever for both everyday users and business professionals.

Easier to use My Places
If you're like me, your growing collection of maps in the My Places panel is getting a bit unwieldy. Every time I find a great new map or upload a new GPS track, it gets a little harder to find things. With this release, we've added a couple of new features to help you clean house a bit and find things more easily. First, we've added the ability to sort a folder - just right-click on any folder and choose "Sort A-Z." We've also made our My Places search feature easier to find; now all you have to do is type in the name of a map or a feature and it will highlight in the My Places panel.

You can now sort your My Places folders to improve organization.

Improved Street View
Building on the improvements we made to the Street View experience in Google Earth 6, we've now added even more Street View features, including better zoom control through the slider tool and a wider field of view similar to Google Maps. You can now also navigate from one place to another with just a single click of the mouse. These features make Street View in Google Earth more immersive, while performance improvements create a faster, smoother overall experience.

Street View in Google Earth now has a wider field of view.

Google Earth Pro
While these features are available to all of our users, much of the work we've done in Google Earth 6.1 benefits power users and professionals who use Google Earth Pro, including:
  • Enhanced print layout: Pro users can now include scale bars and directional arrows when printing, making it easy to include all relevant information in client presentations.
  • Simplified movie maker: It's now easier to convert saved tours to video and record live actions from the 3D viewer to really bring your presentation to life.
  • Expanded data styling: Control up to 64 unique style attributes for imported datasets.
  • Improved networking infrastructure: Earth Pro 6.1 received a robust network update, which offers better support for network proxies and SSL certificates commonly found in corporate networking environments.
  • Combined elevation profiles and ruler tool: We know that sometimes distance is only one part of the equation. We've tied elevation profiles into the ruler tool, making it possible to take into account the entire 3D environment when measuring distance.
Combined ruler and elevation profile tool used to measure Yosemite's Half Dome Peak.

We hope these enhancements make it even more fun and exciting to explore the planet, wherever you are in the world. Download Google Earth 6.1 to get started.

Posted by Peter Birch, Product Manager
Google Earth Street View



Tuesday, September 20, 2011

Cluster Hire: Three Assistant Professors in Geological / Physical Oceanography


For anyone interested, here is a message about a cluster hire that has just been announced at VIMS.

Begin forwarded message:

From: Carl Friedrichs <cfried@vims.edu>
Date: September 20, 2011 11:43:55 AM EDT
To: Carl Friedrichs <cfried@vims.edu>
Subject: Cluster Hire: Three Assistant Professors in Geological / Physical Oceanography

Please pass on this job opportunity announcement to your colleagues (apologies for any cross-postings):

The Department of Physical Sciences at the Virginia Institute of Marine Science of the College of William & Mary, located in Gloucester Point, VA, USA, seeks applicants for tenure-track faculty positions to begin July, 2012.  Successful candidates will join a department that focuses on continental margin (especially coastal and estuarine) systems and includes chemical, geological, and physical oceanography. This union facilitates the synergy needed to address today's interdisciplinary research problems, environmental issues, and challenges.

Applicants with research interests in the broad areas of physical oceanography, sedimentology, stratigraphy and climate change studies are welcomed, with preference for research focused in estuarine, coastal, and continental margin settings.  Candidates with demonstrated excellence in field-oriented research are encouraged to apply, but they also should be comfortable employing modeling, experimental, or laboratory approaches.  The successful candidates will be poised to take advantage of opportunities for interdisciplinary collaborations that abound at VIMS and the College of William and Mary.  Candidates are especially encouraged to apply who employ quantitative approaches for understanding natural and anthropogenic processes (at present and/or during recent geological times) affecting coastal and estuarine environments.

The successful candidates will be expected to build exemplary research and publication programs, participate in the educational program, and provide service to the Commonwealth of Virginia.  The successful candidates should be capable of teaching physical or geological oceanography at the introductory graduate level; advanced graduate courses in their specialty; and ideally courses emphasizing cross-disciplinary approaches and quantitative methods.

For more information see the department web page http://www.vims.edu/research/departments/physical/ or contact any department faculty member directly.  The department web page includes links to detailed job descriptions and the electronic application site at jobs.wm.edu, where requirements for the application package can be found.  For full consideration, submit application materials by December 1, 2011.  Applications will be accepted until the positions are filled, however.

The employer is an equal opportunity/affirmative action university.  Applications by persons from under-represented groups are strongly encouraged.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Carl T. Friedrichs, Professor of Marine Science
Chair of Department of Physical Sciences
Virginia Institute of Marine Science, College of William & Mary
Mail: VIMS, P.O. Box 1346, Deliveries: VIMS, Route 1208, Greate Road
Gloucester Point, VA, 23062-1346, USA
tel. +1-804-684-7303, fax. +1-804-684-7250, email cfried@vims.edu
http://www.vims.edu/people/friedrichs_ct/

Navy’s Newest Ship Is Pickup Truck of the Sea | Danger Room | Wired.com

http://m.wired.com/dangerroom/2011/09/pickup-truck-of-the-sea/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+wired%2Findex+%28Wired%3A+Index+3+%28Top+Stories+2%29%29

How to speed up Google Earth

Good tips here
http://feedproxy.google.com/~r/GoogleEarthBlog/~3/9vbc6ikWVXs/how_to_speed_up_google_earth.html

Saturday, September 17, 2011

Underwater Photo Gallery

MAKE
September 14, 2011 12:00 PM
by Sean Michael Ragan


As gimmicks go, this one from diver and photographer Andreas Franke is pretty smart: The novelty of an "underwater gallery" is likely to bring lots of attention to his work in the press (heck, it worked for me), and the situation of the exhibit itself pre-screens for clients who are wealthy enough, both in terms of free time and in terms of disposable income, to appreciate and buy expensive art. And it's a relatively safe bet that many who invest the time and energy to actually dive to the gallery will end up buying something, regardless of its artistic merit, if only so they can point at it on the wall later and say, "I bought that photo at an underwater gallery."

Or, it could very well be, Franke has done this solely out of an unselfish desire to create random beauty and delight in the world. In which case, Mr. Franke, please accept my sincere apologies.

In any case, the site of the exhibit is the wreck of the General Hoyt S. Vandenberg (Wikipedia), launched in 1943 and deliberately sunk in 2009 to form an artificial reef off Key West. It's about 100 feet down. [via Dude Craft]

More:
Gallery of underwater sculpture

GPS Made On Earth Photography exhibits underwater