Mobile Web: Homework and Project Proposal

For this week’s mobile web homework, we had to open the default PhoneGap app project in Eclipse and then manipulate the underlying HTML and CSS from the JavaScript developer console.  I did some simple selections, creating a variable “test” to store the collection of all the “H1” elements returned by “document.getElementsByTagName”.  After that, I changed the “innerHTML”, “style.color”, and “style.background” properties to demonstrate that anything in the DOM can be manipulated from the console.
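Here’s a minimal sketch of that console session — the helper function and the “Hello” text are stand-ins, not the exact strings I typed:

```javascript
// restyle(el) mutates one element the same way I did in the console:
// change its text, make the text red, and make the background black.
function restyle(el) {
  el.innerHTML = "Hello from the console";
  el.style.color = "red";
  el.style.background = "black";
  return el;
}

// In the browser console this looked like:
//   var test = document.getElementsByTagName("h1");
//   for (var i = 0; i < test.length; i++) restyle(test[i]);
```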

Screenshots:

HTML source

changing some H1 tags to black background

changing some H1 tags to red text

For our homework we also had to work on our final project proposals.  Phil and I are working together on LiveBeam.  We have some pencil wireframe photos below, which will be converted into the front end next week.  The plan is below.  We also entered LiveBeam into the NYU ITP Pitch Fest.

Project: LiveBeam

Team:

Phil Groman, Ben Turner

Summary:

LiveBeam is your eyes on the ground.  It connects people who want to see images, updates, or video from a remote location with people carrying mobile phones who can provide that on-the-ground, real-time information.

Core Functionality:

  1. There is a dynamic interface with a map locating active Broadcasters worldwide.  A “viewer” sees a “broadcaster” at a certain location on a digital map in his browser.  Broadcasters can update their profiles with events/situations happening around them.
  2. The viewer clicks on the broadcaster and requests multimedia from the broadcaster by way of sending a message through LiveBeam.
  3. The broadcaster can review the requesting viewer’s profile, accept the request, capture the multimedia, and share it via LiveBeam.
  4. The viewer can chat with the broadcaster to direct the information, then leave feedback and rate the quality and relevance of the video feed.
  5. Levels of openness and privacy can be controlled by the broadcasters to only allow video requests from certain contact groups or individuals.
  6. The viewer confirms receipt of the multimedia, ending the transaction and registering appropriate credit to both parties automatically.


Potential Use Cases:

  1. Check conditions at specific locations (traffic / weather / waves / lines at restaurants)
  2. Share live sports / music events
  3. Watch / broadcast breaking news as it happens
  4. Stalk / spy on people
  5. Idle travel to interesting locations
  6. Live pornography
  7. News organizations can access a global network of amateur video journalists
  8. Friends can offer a request-based video feed in Facebook status updates — “Beautiful sunset over the East River — Join Me at LiveBeam”
  9. Allows anyone to become a Broadcaster and to build a live audience


Potential Revenue Streams:

  1. Advertising — banners and pre-rolls
  2. Subscription services — premium accounts with more viewing and broadcasting privileges and functionality
  3. Partnership with a news agency
  4. Sales of branded mobile handsets, optimized for broadcasting
  5. Promotion of top curators and broadcasters: they pay a fee for premium accounts (see #2), and we share revenue with them, as YouTube does


Stage 1: Wireframing and Proposal (Due: 09 Feb 12)
Show wireframes of what various screens may look like.

  • Tasks:
    • Convert meeting notes into proposal and project plan
    • Make wireframes showing mockups of main screens used by broadcasters and viewers
    • Documentation for class
  • Questions / Pitfalls:
    • Are we staying focused on our core functionality to get that working first?
    • Will we have enough time to build core functionality within the course’s 7-week timeframe?
  • Resources:
    • Adobe Photoshop, meetings, Google Docs


Stage 2: Front-End Design (Due: 16 Feb 12)
Convert wireframes to a front-end with dynamic interface, using jQuery Mobile, jQuery, CSS, JavaScript.

  • Tasks:
    • Make main user interfaces for broadcasters and viewers
    • Decide on buttons and layout
  • Questions / Pitfalls:
    • Avoid making the interface too complex with too many features or buttons
  • Resources:
    • jQuery Mobile, jQuery, CSS, JavaScript, virtual web host, PhoneGap


Stage 3: Back-End Design (Due: 23 Feb 12)
Add a back-end database to save data, change front-end so it can interface with the database.

  • Tasks:
    • Link up database with appropriate search and data entry queries via PHP/MySQL
    • Add in privacy settings (Google+ circles possibly, or just add individuals)
  • Questions / Pitfalls:
    • May not know exactly how we want to structure our data or db interfaces
  • Resources:
    • PHP, MySQL, HTML
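One way the front end might talk to the PHP/MySQL back end is through a JSON endpoint.  A rough sketch below — the endpoint name, parameters, and the showBroadcasters callback are placeholders, since we haven’t settled the data structures yet:

```javascript
// Hypothetical: serialize query parameters for a request like
//   broadcasters.php?lat=40.73&lng=-73.99&radius=5
// The PHP script would run the MySQL search and return JSON.
function buildQuery(params) {
  var parts = [];
  for (var key in params) {
    parts.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
  }
  return parts.join("&");
}

// e.g. with jQuery:
//   $.getJSON("broadcasters.php?" + buildQuery({ lat: 40.73, lng: -73.99, radius: 5 }),
//             showBroadcasters);
```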


Stage 4: User Handshaking (Due: 23 Feb 12)
Test and strengthen robustness of handshaking and connection between viewer and broadcaster.  Add in GPS/geolocation/triangulation/manual location entry.

  • Tasks:
    • Test with dummy accounts the basic interactions between viewers and broadcasters
  • Questions / Pitfalls:
    • Keep the interactions simple, work iteratively so as to avoid deep-rooted bugs
    • Will we need a user authentication system or can we use fake accounts for a demo?
  • Resources:
    • Other students, PHP, MySQL, PHP code breakpoint and use case testing
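To keep the interactions simple, the viewer/broadcaster handshake could be modeled as a small state machine.  The states and transitions below are assumptions for the sketch, not a final protocol:

```javascript
// Legal transitions for one viewer<->broadcaster transaction.
var TRANSITIONS = {
  requested: ["accepted", "declined"], // broadcaster responds to the request
  accepted:  ["streaming"],            // broadcaster starts sharing multimedia
  streaming: ["confirmed"],            // viewer confirms receipt, ending the transaction
  declined:  [],
  confirmed: []
};

// advance() either returns the new state or throws on an illegal move,
// which makes bad interactions easy to catch during testing.
function advance(state, next) {
  if ((TRANSITIONS[state] || []).indexOf(next) === -1) {
    throw new Error("illegal transition: " + state + " -> " + next);
  }
  return next;
}
```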


Stage 5: Multimedia Input and Sharing (Due: 01 Mar 12)
Add in video streaming or use external source.  Share video, photos, audio, text through LiveBeam after evaluating best options for each.

  • Tasks:
    • Add in GPS or other geolocational sharing
    • Allow broadcaster to send or share video and images
    • Allow viewer to see chat, video, etc. in viewer’s window
  • Questions / Pitfalls:
    • How accurate can we expect the geolocation to be, or do we need to work around it?
    • Can we capture video or photos into PhoneGap?
    • Can we share video and photos through the app or will we rely on third-parties?
    • Will we need to link up with other services’ APIs?
    • Will we need to provide links to other services to show multimedia?
  • Resources:
    • PhoneGap, virtual web server, data hosting
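For the geolocation piece, PhoneGap exposes the W3C Geolocation API as navigator.geolocation.  A sketch of how we might shape a position before sending it to the server — the rounding precision and the send() helper are assumptions:

```javascript
// Round coordinates to ~4 decimal places (roughly 11 m) before
// sending them to the server, and keep the reported accuracy.
function formatPosition(coords) {
  return {
    lat: Math.round(coords.latitude * 1e4) / 1e4,
    lng: Math.round(coords.longitude * 1e4) / 1e4,
    accuracy: coords.accuracy
  };
}

// In the app (send() is a hypothetical upload helper):
//   navigator.geolocation.getCurrentPosition(function (pos) {
//     send(formatPosition(pos.coords));
//   }, onError, { enableHighAccuracy: true });
```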


Stage 6: Use Cases and User Testing (Due: 01 Mar 12)
Test with multiple users and pre-populated testing accounts.  Conduct actual tests within Manhattan.

  • Tasks:
    • Test core functionality with other ITP students
    • Fix bugs and add in redundancy for potential failpoints
  • Questions / Pitfalls:
    • Will need a lot of time to conduct proper user testing
  • Resources:
    • Other ITP students, extensive note-taking, video recording, screen capturing?


Stage 7: Build a Presentation (Due: 08 Mar 12)
Prepare a slide presentation including research, explanation of core functionalities, potential use cases.

  • Tasks:
    • Make slidedecks for presentations, both technical and business
    • Competitor research
    • Market research
    • Results of user testing
    • Explanation of core functionalities
    • Explanation of potential use cases
  • Questions / Pitfalls:
    • May need to conduct this stage throughout in order to stay on track
    • May need to wait to build business until after successful user adoption
  • Resources:
    • Meet with entrepreneurship mentors and Stern folks for advice
Sean Montgomery:

Great break down of your ‘stages’. My guess is that you won’t get half of it done by March 12th, but please prove me wrong. With regard to pitfalls, I’m not 100% sure how you’re going to stream video and audio. As such, I would start with pictures and see if you can get that working, but do some research on how you might stream audio/video so that whatever design choices you make now will be compatible with streaming down the road.

I couldn’t find your photoshopped screens.

8/10