Mobile Web Final: StreetEyes

Phil Groman and I presented our mobile web app, StreetEyes, for our NYU-ITP mobile web class.  StreetEyes is intended to be a way for a web viewer to look at a map of broadcasters and ask them to collect information on a desired event, protest, or location.  The app allows a web user to view a map of broadcasters, select one, and make a view request to him to start collecting data from the ground and reporting it to the news “stream” that the viewer made from the web.  The viewer and broadcaster (or multiple viewers) could chat in real-time to assign tasks for info collection or to take more photos or to get more descriptive updates.

We feel a good application of this would be documenting a fast-moving protest, or getting eyes on a line at a supermarket or at the DMV.  Traditionally, news is collected by sifting through people’s random Twitter updates, or from more crafted news stories that come later.  If an investigative curator, or someone who knows which details are pertinent in an emergency (say, a first responder), wants information, there is not much way to contact people on the ground to gather data.  We rely on people on the ground to be knowledgeable enough to get the right data, but this is the wrong way to go about things.

Why not have curators direct people to breaking news or emergencies?



What we have left to do:

  • Make the camera work and submit photos to a stream.
  • Fix display bugs in Google Map dynamic view.
  • Much better styling for the stream page.
  • Video-streaming?
  • User authentication and multi-user potential.


Final note.  We feel as though video-streaming apps are stuck on just getting the video up.  Ustream has a nice Android app that very quickly allows you to view a full-screen stream, or start your own.  But it does not function well in disseminating information, particularly on a directed topic.  We would like to be able to embed the video into a stream so that the end result is like a curated Reuters news feed, which differentiates between admins, editors, and random commenters.  Those last commenters are very important though, as they may actually be on-site and have a lot of info to share.  It is a curator’s role to promote that info and demote the chaff.

Another thing.  We found Prezi very useful for our presentation, as well as Droid@Screen, a Java program that runs alongside adb from your Android SDK folder.  It mirrors your phone’s display on your computer monitor, which can then be projected onto a wall or large screen.

Mobile Web: 2 Classes Left

With a couple classes to go, here is the status of StreetEyes, the app made by Phil and me:

Mainly, what we have left to do is add map markers for some dummy users so that a viewer can send them a view request; that request then gets passed along via the database so the broadcasters can see it and accept it.  We also need to make sure the HTML forms passing along the broadcaster’s submissions and the viewer’s requests are linked up.

Perhaps the most work needs to be done on making the beacon page viewable so that you can easily scroll down a list of updates sent by various people, with new updates added dynamically.
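Roughly, the dynamic update could work as a simple polling loop.  A sketch of what I mean — the endpoint and field names here are placeholders for illustration, not our actual code:

```javascript
// Rough polling sketch for the beacon page: ask the server for updates newer
// than the last one we've seen, then append each to the list.  The
// fetchUpdates callback stands in for an AJAX call such as
// $.getJSON('/updates.php', { since: lastId }).
let lastId = 0;

function appendUpdate(update) {
  // On the real page this would be something like $('#updates').append(...);
  // here we just log it so the sketch runs anywhere.
  lastId = Math.max(lastId, update.id);
  console.log("[" + update.user + "] " + update.text);
}

function poll(fetchUpdates) {
  fetchUpdates(lastId).forEach(appendUpdate);
}

// Usage with a fake data source (a real one would hit the database):
const fakeSource = (since) => [{ id: since + 1, user: "phil", text: "crowd moving east" }];
poll(fakeSource);
poll(fakeSource); // the second poll only sees the newer update
```

On the real page, `poll` would be wrapped in `setInterval` so new updates appear every few seconds without a reload.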

Mobile Web: Week #3 Homework

For the homework, we were supposed to go through some JavaScript exercises in the browser console to play with the Document Object Model and with events.  We were also supposed to post some screenshots of the first pages made for our mobile apps.  To avoid spamming my blog, the rest of the post is below the jump:


Mobile Web: Homework and Project Proposal

For this week’s mobile web homework, we were to open the default PhoneGap app project within Eclipse and then mess with the underlying HTML and CSS within the JavaScript developer console.  I did some simple selections, creating a variable “test” to store the object of HTML containing all the “H1” elements, using “document.getElementsByTagName”.  After that, I changed the innerHTML, style.color, and style.background to demonstrate how anything in the DOM can be manipulated using the console.
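For reference, the console session looked roughly like this.  The element contents are made up, and the stand-in `document` object just lets the snippet run outside a browser, where the real `document` is supplied by the page:

```javascript
// Minimal stand-in for the browser DOM so this snippet runs outside a browser;
// in the developer console, `document` already exists.
const fakeH1 = { innerHTML: "Hello PhoneGap", style: { color: "", background: "" } };
const document = { getElementsByTagName: (tag) => (tag === "h1" ? [fakeH1] : []) };

// The actual console steps: grab all H1 elements, then mutate them.
var test = document.getElementsByTagName("h1");
test[0].innerHTML = "Edited from the console"; // replace the heading text
test[0].style.color = "red";                   // red text
test[0].style.background = "black";            // black background
```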


HTML source

changing some H1 tags to black background

changing some H1 tags to red text

For our homework we also had to work on our final project proposals.  Phil and I are working together on LiveBeam.  We have some pencil wireframe photos below, and they will be converted into the front end next week.  The plan is below.  We entered LiveBeam into the NYU ITP Pitch Fest, as well.

Project: LiveBeam


Phil Groman, Ben Turner


LiveBeam is your eyes on the ground.  It coordinates people who want to see images or updates or video from a remote location, together with people with mobile phones who can provide that on-the-ground real-time information.

Core Functionality:

  1. There is a dynamic interface with a map locating active Broadcasters worldwide.  A “viewer” sees a “broadcaster” at a certain location on a digital map in his browser.  Broadcasters can update their profiles concerning events/situations happening around them.
  2. The viewer clicks on the broadcaster and requests multimedia from the broadcaster by way of sending a message through LiveBeam.
  3. The broadcaster can look at the requesting viewer’s profile, accept the request, capture the multimedia, and share it via LiveBeam.
  4. The viewer can chat with the broadcaster to direct the information, and can leave feedback rating the quality and relevance of the video feed.
  5. Levels of openness and privacy can be controlled by the broadcasters to only allow video requests from certain contact groups or individuals.
  6. The viewer confirms receipt of the multimedia, ending the transaction and registering appropriate credit to both parties automatically.
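The handshake in steps 2–6 amounts to a small state machine.  A sketch — the state names and fields are my own shorthand, not a settled design:

```javascript
// Sketch of the viewer/broadcaster handshake as a state machine.
// A request moves: requested -> accepted -> streaming -> confirmed,
// or requested -> declined.
const TRANSITIONS = {
  requested: ["accepted", "declined"],
  accepted: ["streaming"],
  streaming: ["confirmed"],
};

function advance(request, next) {
  const allowed = TRANSITIONS[request.state] || [];
  if (!allowed.includes(next)) {
    throw new Error("cannot go from " + request.state + " to " + next);
  }
  return { ...request, state: next };
}

// Walk one request through the happy path:
let req = { viewer: "alice", broadcaster: "bob", state: "requested" };
req = advance(req, "accepted");  // step 3: broadcaster accepts
req = advance(req, "streaming"); // broadcaster captures and shares multimedia
req = advance(req, "confirmed"); // step 6: viewer confirms receipt
```

Keeping the legal transitions in one table makes it easy to add states later (say, a dispute step before credit is registered).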


Potential Use Cases:

  1. Check conditions at specific locations (traffic / weather / waves / lines at restaurants)
  2. Share live sports / music events
  3. Watch / broadcast breaking news as it happens
  4. Stalk / spy on people
  5. Idle travel to interesting locations
  6. Live pornography
  7. News organisations can access a global network of amateur video journalists
  8. Friends can offer a request-based video feed in Facebook status updates — “Beautiful sunset over the East River — Join Me at LiveBeam”
  9. Allows anyone to become a Broadcaster and to build a live audience


Potential Revenue Streams:

  1. Advertising — banners and pre-rolls
  2. Subscription services — premium accounts with more viewing and broadcasting privileges and functionality
  3. Partnership with a news agency
  4. Sales of branded mobile handsets, optimized for broadcasting
  5. Promotion of top curators and broadcasters, having them pay a fee for premium (see #2) but share revenue with them like YouTube does


Stage 1: Wireframing and Proposal (Due: 09 Feb 12)
Show wireframes of what various screens may look like.

  • Tasks:
    • Convert meeting notes into proposal and project plan
    • Make wireframes showing mockups of main screens used by broadcasters and viewers
    • Documentation for class
  • Questions / Pitfalls:
    • Are we staying focused on our core functionality to get that working first?
    • Will we have enough time to build core functionality within the course’s 7-week timeframe?
  • Resources:
    • Adobe Photoshop, meetings, Google Docs


Stage 2: Front-End Design (Due: 16 Feb 12)
Convert wireframes to a front-end with dynamic interface, using jQuery Mobile, jQuery, CSS, JavaScript.

  • Tasks:
    • Make main user interfaces for broadcasters and viewers
    • Decide on buttons and layout
  • Questions / Pitfalls:
    • Avoid making the interface too complex with too many features or buttons
  • Resources:
    • jQuery Mobile, jQuery, CSS, JavaScript, virtual web host, PhoneGap


Stage 3: Back-End Design (Due: 23 Feb 12)
Add a back-end database to save data, change front-end so it can interface with the database.

  • Tasks:
    • Link up database with appropriate search and data entry queries via PHP/MySQL
    • Add in privacy settings (Google+ circles possibly, or just add individuals)
  • Questions / Pitfalls:
    • May not know exactly how we want to structure our data or db interfaces
  • Resources:
    • PHP, MySQL, HTML


Stage 4: User Handshaking (Due: 23 Feb 12)
Test and strengthen robustness of handshaking and connection between viewer and broadcaster.  Add in GPS/geolocation/triangulation/manual location entry.

  • Tasks:
    • Test with dummy accounts the basic interactions between viewers and broadcasters
  • Questions / Pitfalls:
    • Keep the interactions simple, work iteratively so as to avoid deep-rooted bugs
    • Will we need a user authentication system or can we use fake accounts for a demo?
  • Resources:
    • Other students, PHP, MySQL, PHP code breakpoint and use case testing


Stage 5: Multimedia Input and Sharing (Due: 01 Mar 12)
Add in video streaming or use external source.  Share video, photos, audio, text through LiveBeam after evaluating best options for each.

  • Tasks:
    • Add in GPS or other geolocational sharing
    • Allow broadcaster to send or share video and images
    • Allow viewer to see chat, video, etc. in viewer’s window
  • Questions / Pitfalls:
    • How accurate can we expect the geolocation to be, or do we need to work around it?
    • Can we capture video or photos into PhoneGap?
    • Can we share video and photos through the app or will we rely on third-parties?
    • Will we need to link up with other services’ APIs?
    • Will we need to provide links to other services to show multimedia?
  • Resources:
    • PhoneGap, virtual web server, data hosting
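The capture step might chain geolocation and the PhoneGap camera API like this.  A sketch only: the stub `navigator` object is injected so it runs off-device, and the upload step is omitted:

```javascript
// Get a GPS fix, then a photo, then hand both off for upload.
// `nav` is the browser/PhoneGap navigator object; navigator.camera.getPicture
// is PhoneGap's camera API (success and error callbacks).
function captureReport(nav, onReady) {
  nav.geolocation.getCurrentPosition(function (pos) {
    nav.camera.getPicture(function (imageData) {
      onReady({
        lat: pos.coords.latitude,
        lon: pos.coords.longitude,
        photo: imageData, // base64 image data by default
      });
    }, function (err) { console.error("camera failed", err); });
  }, function (err) { console.error("no GPS fix", err); });
}

// Stubbed usage; on a device you would pass the real `navigator`:
const stubNav = {
  geolocation: {
    getCurrentPosition: (ok) => ok({ coords: { latitude: 40.73, longitude: -73.99 } }),
  },
  camera: { getPicture: (ok) => ok("fake-image-data") },
};

let lastReport;
captureReport(stubNav, (report) => { lastReport = report; });
```

Injecting `navigator` also answers the testing question above: dummy accounts can exercise the whole flow in the emulator without real GPS or camera hardware.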


Stage 6: Use Cases and User Testing (Due: 01 Mar 12)
Test with multiple users and pre-populated testing accounts.  Conduct actual tests within Manhattan.

  • Tasks:
    • Test core functionality with other ITP students
    • Fix bugs and add in redundancy for potential failpoints
  • Questions / Pitfalls:
    • Will need a lot of time to conduct proper user testing
  • Resources:
    • Other ITP students, extensive note-taking, video recording, screen capturing?


Stage 7: Build a Presentation (Due: 08 Mar 12)
Prepare a slide presentation including research, explanation of core functionalities, potential use cases.

  • Tasks:
    • Make slidedecks for presentations, both technical and business
    • Competitor research
    • Market research
    • Results of user testing
    • Explanation of core functionalities
    • Explanation of potential use cases
  • Questions / Pitfalls:
    • May need to conduct this stage throughout in order to stay on track
    • May need to wait to build business until after successful user adoption
  • Resources:
    • Meet with entrepreneurship mentors and Stern folks for advice

Dynamic Web Dev & Mobile Web: Final Project Proposals?

For my Dynamic Web Development class (DWD) (syllabus), my first homework assignment was to lay out a proposal for a final project.  Here is the class description:

“The class will cover server-side and client-side web development topics using JavaScript. On the client-side, we will cover traditional JavaScript and the jQuery library to manipulate browser content, create and trigger page events and make AJAX data requests. Developing with NodeJS on the server-side, we will explore receiving input from a user then querying and saving that data to a database, and finally, returning the appropriate content to the client, i.e. HTML or JSON. The websites we use today are rarely on a single database, we will focus on consuming data APIs from websites like Foursquare (for location information), Facebook (for social graph) and Twilio (for SMS and telephony). Going further, we will create custom data APIs for use at ITP and open to the public.”

I’m looking forward to using node and also Twilio (I’ve dabbled a bit in both).  I think Twilio interaction may overlap a bit with my Redial class, which uses the open-source phone comms software Asterisk.

I’ve been working through some codecademy exercises, as instructed on the homework assignment.

I think this class will also dovetail nicely with the mobile web class I’m taking.

For these reasons, I wasn’t entirely sure which idea I wanted to do for my final project just yet.  It could be that I build the back-end for an app for my DWD class but then do a front-end for an Android phone in mobile web class.

Final Project Ideas

Moment Quests:

Part scavenger hunt, part walking tour, part questing/geocaching.  Say you send your girlfriend on a moment quest by sending an invite to her on her mobile phone or browser.  She can accept the quest or delay it or do another quest instead.  Rewards are important — you might not want to do a quest if the payoff is too low!  An instruction will tell her what to do next, where to go, etc.  Once she’s there, she has to offer proof that she’s completed the step: a photo of a receipt, a photo of a landmark, a description of a sign at the exact point, etc.  Then the next stage is unlocked, either automatically or manually by you upon verification.

Once she completes all the necessary steps in the quest, a reward is unlocked, or the final step in the quest takes her to her award (e.g. you’re waiting there for her, a gift is given by someone at the location, a secret GPS coordinate is unlocked, etc.).

What’s great is that the infrastructure is flexible — it would just require some GPS checks or a way to approve submitted proof of completing tasks.  You could also make public moment quests.  Companies or promotional events could do one-day quests which pit people against each other to be the first ones to finish, or to find something.

Part of what inspired me to do this was Daniel Suarez’s book Daemon, in which the AI software chooses its chief real-world henchman, Loki, after he completes its quest: he’s an avid Wolfenstein gamer who comes across a unique server with an original map which is nearly impossible to beat.  Loki figures out how to beat it though, and then he is initiated into the AI’s recruiting process, where he has to drive out into the middle of nowhere and find his way into an unmarked building.  Once he passes these stages, he’s given a meshnet of killer motorbike drones and other special equipment.  Pretty wicked.

The app will probably require a mobile component, which might use the internal GPS as well as the camera for verification.  A backend will have to be built to store details about the moment quests and what stage of completion a person is through them.  Personal details will need to be saved as well to save info on awards, unlocks, communications between the quester and the dungeonmaster, etc.
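For the GPS checks, proof-of-location could just be a distance test against the step’s target coordinate.  A sketch using the haversine formula — the 50-meter radius and the sample coordinate (Washington Square Arch) are arbitrary choices for illustration:

```javascript
// Great-circle distance between two lat/lon points, in meters.
function haversineMeters(lat1, lon1, lat2, lon2) {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// A quest step is complete when the phone's GPS fix lands within the
// step's tolerance radius of the target.
function stepCompleted(step, fix) {
  return haversineMeters(step.lat, step.lon, fix.lat, fix.lon) <= step.radiusMeters;
}

// Washington Square Arch as a sample quest step:
const step = { lat: 40.7308, lon: -73.9973, radiusMeters: 50 };
console.log(stepCompleted(step, { lat: 40.7309, lon: -73.9972 })); // ~14 m away
console.log(stepCompleted(step, { lat: 40.7408, lon: -73.9973 })); // ~1.1 km away
```

The tolerance radius would need tuning, since phone GPS in a city can easily be off by tens of meters.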


probablyGonna:

I worked on probablyGonna last semester for my comm lab web class, writing it in Ruby/Sinatra.  I still need to make it more robust and useful, so it could be a good project for DWD.  The idea: you may know you want to go out dancing Saturday night, but you don’t know who else wants to go or who is already going, so you put out a general invite and see who responds.  Or say you’re at school and heading out for lunch; you put out an immediate alert for anyone who’s hungry to come join you at this or that place.  It facilitates event planning on the fly, unlike Google Calendar, which is fairly regimented, or Foursquare, which only seems to capture people’s presence at a location when they’re about ready to leave.


LiveBeam:

Another classmate of mine, Phil, had the idea to create an app that lets an online social media curator tap into a network of available cell phone reporters.  Say there’s a breaking news event occurring at Zuccotti Park; a web journalist looks at a location map of nearby reporters who could go to the site and film or record it.  LiveBeam would allow the curator to ping a reporter and see if he could head over and cover the event.

What follows is my interpretation of his idea — he may have something different in mind!

The reporter would be selected based on proximity to the event, the reporter’s reputation for producing a certain type of content (liberal, conservative, streaming video, professional photos, etc.), and availability.  I found that, in my previous job doing social media emergency management, sometimes reporters didn’t know where the action was, or maybe the reporters were screaming at their editors or bosses that the action would be here instead of over there.  Sometimes the best information on something like a remote-area wildfire in New Mexico or a passport fraud bust in Anchorage would never make it to the mainstream news, or maybe one or two regular joes or local reporters would cover the news.  You can’t always rely on the “best” news sources to deliver all the news and information promptly, particularly if a client is looking for more specific, targeted news that the broader outlets ignore.

There is often information asymmetry in emerging crises, and sometimes the best journalist will not be at the proper location, so there needs to be a way to reallocate reporters to proper sites, or signal the best reporter for a given scenario as the one to follow.  You see this on Twitter during a crisis when the top journos tell everyone to Twitter-follow certain people who seem to be producing the best content possible.

People who continuously deliver the best content after being pinged on LiveBeam could have a higher reputation on a site and would become the people who’d be pushed to head on-site.  But there’d still be other options for redirecting traffic towards the best-positioned journo/reporter in any given crisis.  The other part of this is that there is not really enough recognition online for the role of the curator, whose job it is to filter through all of the noise generated on the internet every day and pick only the most important or under-covered stories, adding his or her own editorial take on why the issue is important relative to others.

LiveBeam is likely to need, at the least, backend storage of how to link the curator to the reporter, and GPS capability to place the reporter on a map so s/he can be notified of directions by the curator.  S/he will also probably want the ability to request a curator turn his eye of Mordor onto another event if that event is deemed more important by a reporter.  Or perhaps votes can be tallied by the top emergency management people to raise awareness of an issue.

Mobile Web Homework

Finally, for mobile web we had to create a quick initial app using some buttons, images, and text.  Below’s a screenshot from the Android emulator (running 2.2):

We’re going to be using PhoneGap for this class, and I added some jQuery Mobile to play with what it has to offer.  Download the .zip source.  Source of assets/www/index.html below: