— urbantick

Archive
Tag "locationInformation"

It is here, finally, for the iPhone. Layar is available, and with it a whole series of information packages. There is nothing new in the content, but the way it is visualised is new. You get familiar stuff like Wikipedia, Open Street Map, ArchINFORM, Twitter, Panoramio, flickr and Brightkite, plus a lot more. There is a high chance that the library will grow dramatically in the next few months. Currently a lot of services from Japan are available as layers, as well as from the Netherlands. An up-to-date list of Layar layers can be found HERE.
Layar is basically the browser that visualises the data provided by individual companies offering a specific service. Download the app for your iPhone here.
So let's have a look at how it looks and feels by testing some services around CASA.

layarOSMhorizon01.IVsI4GsYF8lK.jpg
Image by urbanTick – Screenshots. Depending on the angle, Layar adjusts the horizon line of the overlaid plane that serves as a reference for the displayed data.

The reference information is drawn from GPS / Wi-Fi / the network to establish the current location. The compass built into the iPhone gives the direction the phone is pointing. Layar provides a grid plane to locate the information and presumably give a better sense of depth. The icons used to represent the information are rather simple: a circle, a square, and so on. The interaction with these objects is limited to selecting them. It turns out that this is a difficult task at times: first, because only a rather small area of the screen is available for the actual AR display (the rest is cluttered with backup information), and second, because the icons overlap one another and are displayed even smaller the further away they are from the present location. There is an automatic selection that works fine if there are only one or two items on the screen, and by moving the iPhone you can alternate between them, but as soon as you get more items the sensitivity of the compass cannot keep up with the millimetre differences between the items.
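The underlying geometry is simple enough to sketch. A minimal, hypothetical version of the icon placement (my own illustration, not Layar's actual code) computes the compass bearing from the phone to each point of interest and maps its offset from the compass heading onto a horizontal screen position:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def screen_x(poi_bearing, heading, fov=45.0, width=320):
    """Horizontal pixel position of a POI on the display,
    or None if it falls outside the camera's field of view."""
    # Signed angle between the POI and the screen centre, in (-180, 180].
    offset = (poi_bearing - heading + 180) % 360 - 180
    if abs(offset) > fov / 2:
        return None
    return round(width / 2 + offset / (fov / 2) * (width / 2))
```

Anything outside the assumed field of view returns None; a real browser would additionally scale icons down with distance, which is exactly what makes the far-away ones so hard to tap.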

layarBrightkite01.K1FdFbeMuNeV.jpg
Image by urbanTick – Screenshots, Brightkite layer on Layar

The top bar holds a settings button that contains a number of options related to the service. For example, the range/distance within which results are displayed can be adjusted. The second bar on the top allows you to switch between a map, a list and the AR mode, here called reality, WOW! Additional information for each selected item is displayed in the box below. It also provides a link to the displayed content at its original location on the web. Meaning, Layar is really just a window to search for stuff. In this respect it could increasingly compete with Google, which raises the question of why Google has not yet developed its own service, or when it will buy Layar.
Well, at this point it is still a very crude application with a rather cluttered and ugly interface, crappy icons and not very intuitive handling. But you know, it is a first stab at a commercial platform to display location-based information projected onto reality through the lens of a camera, and this is exciting enough.
How beautiful and simple this could look was shown by acrossair; it was reviewed in an earlier post HERE.

layarLondonTube01.QS6OcUQasLIL.jpg
Image by urbanTick – Screenshots, London Tube, not as nice as Nearest Tube, but with additional information that it links to.

layarPanoramio01.YicpvlzwNjNN.jpg
Image by urbanTick – Screenshots, Panoramio as the Layar layer, link page and panorama on the web.

Read More

CoMob is an iPhone GPS tracking application developed at Edinburgh College of Art in collaboration with Edinburgh University. “The CoMob iPhone Application was developed as part of a research project exploring the creative use of collaborative GPS mapping.”
It is a simple tracking application that sends the location to a customisable server. It was designed for an art project presented at ISEA2009. Some images of the event can be seen on flickr here by jensouthern. The application determines the position and sends the information to a pre-configured server. The update frequency is customisable, as is the server. You can change the server and, for example, send the location to your own server. It does not give you any visual feedback; all you can see is numbers. The interesting data is saved on the server.
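Since both the server and the update frequency are configurable, the core of such a tracker can be sketched in a few lines. This is my own illustration of the idea; the actual CoMob wire format, field names and endpoints are not documented here:

```python
import json
import time

def location_payload(user, lat, lon, timestamp=None):
    """Build the JSON body a simple tracker might POST to its configured
    server. Field names are illustrative, not CoMob's real format."""
    return json.dumps({
        "user": user,
        "lat": round(lat, 6),   # ~0.1 m precision is plenty for GPS fixes
        "lon": round(lon, 6),
        "time": int(timestamp if timestamp is not None else time.time()),
    })

# A tracking loop would then send this every `interval` seconds, e.g.:
#   while True:
#       requests.post(server_url, data=location_payload(user, lat, lon))
#       time.sleep(interval)
```

The point of keeping the server URL and interval as plain settings is exactly what the post describes: you can redirect the stream to your own server and choose how often it updates.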

PastedGraphic.BwcYLk9ypPLw.jpg
Image from CoMob – Logos CoMob (red) and CoMob Net (blue)

The CoMob (in red) application has only recently received a sister application, CoMob Net (in blue). It is built on the basis of CoMob, but adds some group functionality and a visualisation using Google Maps. A group of iPhone users can use the application simultaneously and see the location of each group member on the screen. Locations are shown with connection lines between them, producing shapes across the urban fabric. Usage is really simple: all you have to do is put in a user name and choose a name for the group. If joining an existing group, simply type the name in the box provided and you're linked up. Here too it is possible to customise the server that stores the data.
So get your iPhone friends to come out into the streets and start mapping… Download CoMob or CoMob Net directly from iTunes here. You can then join our casa group by entering the name of the group on the settings page (lowercase, and you have to hit return to verify the entry).

comob02.5HPeF4AUThTR.jpg
Image by urbanTick – Screenshots CoMob Net

Read More

A brilliant timelapse, stop motion, tilt shift, tracking movie. Actually it is an advertisement clip for an internet platform, maptype.com. Unfortunately it is all in Russian, so I am not on top of the information; I simply like it. It looks like some event site, with symbols for party locations, museums, theatre venues, universities, shopping, eating and cinema locations. I assume the tracking in the clip refers to people going to these locations.

MapType from MapType on Vimeo.

Read More

When did you last get lost? It must have been a while. The art of getting lost has got lost itself nowadays. The sense of not knowing the exact direction to a familiar object, place or location can be very unpleasant. On the other hand it can be very relieving. If you are prepared to accept that you have lost control over the situation, or at least the location, you might find yourself enjoying it.
The idea of strolling through the city, not directed by a specific destination, is a concept introduced by the Situationists. The aimless wandering, or dérive as it is called in Situationist writing, can even be a method to observe the city.
However, people also get lost unintentionally. The marketing campaigns of a number of companies make us aware of the many ways we could get lost, and with this fuel a lot of people's fears of their immediate surroundings. In-car navigation has become the number one gadget in car sales; it has overtaken air conditioning and the CD player.


Image by Fischer Portugal for Honda / promoting Honda’s Compact Navigation System.

People seem to enjoy being talked through the environment, and then it all depends on the voice. I assume gadget developers put a lot of thought into the voices they offer as the direction instructor. Even how it is said must be important. In a recent interview Bob Dylan announced that he is in talks with GPS manufacturers to lend his voice to a next generation of gadgets. Click here for a sample of his voice. I am still waiting for the voice-over that starts shouting at someone who just missed the turn for the fourth time: "You twat, can't you follow instructions! I said turn LEFT!" The other way round, people shouting at the in-car navigation system, is probably quite common.
The BBC has recently collected a number of stories of people getting lost with the GPS. Due to a software fault : ) the GPS will not correct your spelling mistakes. And it seems that people quite often misspell their destination. And a little knowledge is still needed to distinguish between Carpi and Capri, as a Swedish couple learned after they arrived in the industrial town of Carpi instead of the island of Capri in Italy. via GPSCity

GPS from DustFilms on Vimeo.

Read More

In the context of the Small World time Lapse series I was obviously interested in what else is going on in this field of panoramic photography. Just by chance I also came across new smart camera cars in the neighborhood. I approached them and we had a chat about their work.
They were expecting me to ask about Google Street View. They apologised for not working for Google, and it turned out they work for the London-based company 360viewmax (it was printed in rather big letters all over the small car) and are doing a job for Islington council. It appears that the council has discovered the value of Street View for its purposes. It wants to use it for maintenance surveys. What exactly that means I haven't really figured out.
How it works is quite complicated, as it involves two people in the car. There is a second, quite big piece of writing on the back of the small car: "Caution this vehicle stops frequently". Meaning what it says, the car stops every 20 metres or so to take a picture. It is kind of done manually. Beside the driver, the second person in the car has a laptop with GIS information on a map. The location of the image is, I believe, manually input into the GIS system. GPS, they told me, is only used for rough navigation, as they say it is not accurate enough. Compared to this, the Google cars just drive along the road and take photographs on the go. The argument of 360viewmax is that they want to deliver high-quality images with a lot of detail. The installation on the roof of the car is three Nikon P6000 cameras. Funnily enough, the cameras have a built-in GPS module, but it is not used.
However, there is a cool demonstration of it on the 360viewmax webpage (I had some issues with Firefox this morning when I tried it, but it worked in Safari). You can click into an Islington neighborhood and down to street level to jump into bubbles of 360-degree panoramas. The interface is rather crude and located somewhere in a GIS technical engineer kind of world. Maybe at some point they will develop a neatly designed consumer interface.

360viewmaxExamples.CLw81rI7Mtij.jpg
Images by 360viewmax – screenshot – plan overview, panorama, zoomed in on a car

There has been a huge debate about privacy around Google Street View, and they were forced to blur faces and number plates. In this council version of Street View, however, these elements are not blurred and number plates can be read, for example.
In terms of Google Street View, it has sparked a lot of controversy, especially around its launch in each new area. I remember the fuss about it in London for a week, when it first launched earlier this year. And just a month ago the launch in Switzerland sparked the same discussion. Now in London there is hardly any comment on it in the news, apart from the odd use of the service to visualise a location. Also in everyday conversation the fear of losing privacy has been replaced by curiosity and acknowledgement. People speak about it as a useful tool, mainly saying: it is great to see a location that you are not at. Then they bring up the excuse of planning for a journey and how it would help to orient themselves in unfamiliar surroundings. Well, it might, but come on, it does not really replace being there. It is related to the phenomenon of the photograph and the discussion of truth. In general photographs are believed to be a true image of reality, and in this view Google Street View is therefore a digital replication of the actual scenery at this location. So it raises the question of whether it is live and people can be seen, because people identify with it so intensively that it becomes a virtual reality.
However, if you are interested to know where the real Google Street View cars are driving at the moment, Google has finally disclosed this information. Not in detail, but you get an idea which areas are being mapped at the moment, and the chances are that you will come across a Google camera car. You can click here.

Read More

TheMonumentProject02.IjzNJs6pnUwo.jpgTheMonumentProject04.X6OdhYw4BUKP.jpg

TheMonumentProject03.nTBbafc0T996.jpg
Images by The monument View Project – Screen shots on 2009-08-11

Looking back at the London Small World clip I produced a few weeks ago, there is some contextual stuff that should be published alongside it.
One such project is the London Monument View. It is quite simply what the title suggests and, in short, the 365/24/7 version of London Small World. It is a camera with a 360-degree lens that is installed on top of the Monument in London. It gives a live webcam image and also a time lapse of the previous day.
It is an art project by Chris Meigh-Andrews installed in 2008 during the renovation of the Monument. The idea is to process the images according to environmental data. In detail this means the orientation of the images corresponds with the wind direction, the air temperature influences the colour tone and the wind speed the speed of the image stream.
The construction on top of the Monument looks like this; funnily enough, the glass jar in the middle is the actual lens case, so quite small, while the weather station takes up a lot of the space.

PastedGraphic.scg3nhOfXSMC.jpgPastedGraphic3.c0FoW0tikpHm.jpgPastedGraphic1.fYZwub0Oo3u1.jpg
Images by Chris Meigh-Andrews – Finished installation, environmental sensors, lens VR360

The environmental sensing equipment is the same as Andy Hudson-Smith over at digitalurban uses. He has a live page that also works on the iPhone.
To see today's Monument panorama go here; there is also a log book where you can access any date in 2009. If you are interested in today's time lapse click here; you probably have to wait a second for the clip to load.

Read More

A new application is available for the iPhone and other mobile devices that provides traffic information (mainly aimed at individual car traffic) and at the same time records traffic conditions to update the information.
This commercial application is called WAZE and is at the moment only available in the States and Israel, it seems. It is developed by Ehud Shabtai, Uri Levine and Amir Shinar. It is one of the first truly crowd-sourced applications. The user data from GPS automatically generates a live map. If the user moves slowly it will show as a red trail on the road, and others can see that there might be a traffic jam. In addition, users can also upload detailed information such as a speed cam or an accident, and even record additional roads that do not yet appear on the map. A guided tour with comments can be found here. The WAZE fan page on facebook is here.

There are two questions that I allow myself to ask. One is the obvious question of how to verify the user-generated data. Who can be trusted, and who might just play with the application? For example, if I were using it (I don't own a car) and logged data while walking, there would be red roads wherever I go. So is there some sort of filtering and overriding feature built into the automatic live mapping?
The second question is one I already asked myself while reading about the MIT user-focused and especially mobile-phone-focused research in Carlo Ratti's SENSEable City team, where they also call for products that help individual car drivers find better ways through the city. Why would we want to develop and use something as old-fashioned as this? Individual car traffic is so 1920. And in this century still being trapped in this discussion about being fast and powerful and independent and so on is a bit sad, actually. I don't believe this can provide us with a global solution; it is just another attempt to strengthen individual needs in a struggling urban environment.
But in this case it is a technologically advanced cool gadget for the cool gadget you already own, so why not use it!

The MIT project is a bit older and called CarTel. It is a bit more complicated but essentially works the same way. The iPhone and Android option obviously is pretty hot. The graphics are a bit too playful and childlike for my taste and could be a tick more formal and serious, but there you go.

Image from WAZE.com – live map

Thanks to gisagents

Read More

As promised in the last post on Google's Latitude, I spent some more time on other options. And actually it can be said up front: Latitude is boring, whereas other applications can be very exciting.
Sorry, I had to mention this. As discussed in a comment last week, Latitude is probably not meant to be cool. I now understand it more as an additional data service Google provides. A service that especially targets a new market of location-based information. I assume Google plans to get people to use it, but then to involve third-party companies to "use" the location data to target them specifically. This will most probably include Google itself, for ad placements for example.
Anyway, this is only speculation and others may be more expert in this field. There is a huge discussion on this topic, including some horrific stories about privacy and stuff.
But this was about other options for location-based interaction. From iPhone-based tracking, the step towards web-based tracking is not far, and the set of additional options is enormous. Starting from a simple message or chat tool right up to location-based tags and content such as photographs. The limitations of gadget-based tracking are obvious; it is as if you are talking to yourself, a rather introverted and singular recording of spatial movement. The web-based option on the other hand offers instant updates and interaction.
I have been testing Brightkite and MapMe over the last few days and I am just blown away. Not necessarily by the interface, the options or the features, but more by what a location-based social networking tool could be. Facebook is so 1957 compared to this. The exciting thing is probably that you can take it with you and that where you are actually influences what you see, on the little screen of course. On the other end, the information you add to the network has this same dimension too. So, used on a mobile device, you actually get in touch with new people quite easily, because you constantly come across other people's digital junk (positive) in real space.

Image by UrbanTick – Screenshot history page with timeline on the top

But to start from the beginning: how does it work, what can you do, and how does it feel? First we look at the MapMe application. It is developed by John McKerrell. It is a place to store your location and share it with friends. Like Latitude it has a main page on which it shows your location on a map. This map is based on Open Street Map data. A big, awful yellow marker has "I am here" written on it. Maybe "ME" would do, as the service is called mapME? The big problem is the colourful approach of Open Street Map. It makes it really hard, if not impossible, to actually see the location dots other than the big yellow box. Have a try on the image above: can you spot the greenish-brown dots? At least in London this is the case, because it is so dense. Somehow the colours on MapMe appear brighter than on the original OSM page.
A number of sources can be used to feed the location into the application: email, FireEagle, Twitter, Latitude, an RSS feed or InstaMapper. This variety is great, although some options seem rather crude. Like email; but then you think, there might be some devices that update positions via SMS or email if they are not based on the rather new concept of free unlimited data access, so yes, a great option.
The second cool add-on here is the timeline, hidden in the history tab. It makes past locations accessible in a timeline. It is based on the SIMILE timeline code on Google Code. It is an interface based on horizontal bands, each of which represents a time unit. One is the year, then the month and then the day; even the hour can be added. By pulling the bands one can navigate in time. The location points are then displayed on both the band (as dots or lines) and the map. The two stay in sync while moving through time. Brilliant feature. This is probably the first feature you will miss on Latitude!


Image by UrbanTick – Screenshot MapMe

That's about it on MapMe. Unfortunately I have not been able to find any of my friends on this network, as it only allows you to search by username, and if you don't know it, you don't know. So if you are on MapMe please add me as a contact! I was just looking for a direct link to my profile, but could not find anything, so search for UrbanTick.
The link page is actually the history page. So here is my link then – UrbanTick.
It is really not so much a socialising tool as a personal recorder, for which it works brilliantly. It actually offers a developer API to add to the existing application and also lets you access the recorded data. Information about this is on the mapme blog.

If we move over to Brightkite, this is completely different. It is a fully grown social networking tool. It is like Facebook with a different design attached. Surprisingly there is no map! Not that Facebook would have one, but if the service is location-based the first thing to think about probably is a map. In the discussion board, what a surprise, there is a thread about this, and the reply by Martin May one year ago was "That's coming…the map is kinda clunky right now. We have great plans for it, but it will take us some time to get everything in…it's beta, after all." So there is still no map and it is still beta, but it is still cool. You know, maybe not having a map makes it more interesting. On the iPhone, I have to say, there is the option to click on things and it will open the location in the Maps application. There is actually the same button in the web tool. A map can be accessed through an individual post or location. It even embeds Google Street View to give you an image of the location beside the post.
Having said that, there is one really cool feature that almost makes up for the missing map. It is possible to export the posted content as a KML file to Google Earth or link it as an RSS feed. And it includes not only your stuff, but your friends' posts as well, great. Guess you could simply put that feed into Yahoo Pipes and have it on a map.
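Producing such a KML feed is straightforward; here is a minimal sketch (my own, not Brightkite's exporter) that turns a list of posts into placemarks Google Earth can read. Note that KML orders coordinates longitude first:

```python
from xml.sax.saxutils import escape

def posts_to_kml(posts):
    """Render a list of {'name', 'lat', 'lon'} dicts as a minimal KML
    document, similar in spirit to the feed Brightkite exports."""
    placemarks = "".join(
        "<Placemark><name>{}</name>"
        "<Point><coordinates>{},{},0</coordinates></Point></Placemark>".format(
            escape(p["name"]), p["lon"], p["lat"])  # KML wants lon,lat,alt
        for p in posts)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">'
            '<Document>{}</Document></kml>'.format(placemarks))
```

Saved with a .kml extension, the output opens directly in Google Earth, which is exactly why a feed like this almost replaces a built-in map.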
The really big thing here is the location-based filtering through which you can access content. You can literally run into a comment or an image! The information filter is based not only on your friend network but also on the location: close (20m), block (200m), neighborhood (2km), area (4km), city (10km), metro (50km), region (100km).


Image by UrbanTick – Screenshot Brightkite web app distance filter
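The banding itself is easy to sketch with a great-circle distance calculation. The radii below follow the bands listed above (taking "close" to be 20 m); the code is my own illustration, not Brightkite's:

```python
import math

# Band name and radius in metres, smallest first.
BANDS = [("close", 20), ("block", 200), ("neighborhood", 2000),
         ("area", 4000), ("city", 10000), ("metro", 50000),
         ("region", 100000)]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in metres."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def band(lat1, lon1, lat2, lon2):
    """Smallest band that contains the second point, or None beyond 100 km."""
    d = haversine_m(lat1, lon1, lat2, lon2)
    for name, radius in BANDS:
        if d <= radius:
            return name
    return None
```

A post would then be shown whenever its band is at or below the filter level the user has selected.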

This becomes really interesting if we take the aspect of time into account. I thought about this when I posted a random picture of something I simply had in front of my lens, a construction site on the road. Now I am able to look at images other people have posted in the same location from before the construction started, and people will pass by this location in the future and see my image of the building site even though the construction has long finished. Meaning that it builds up an immensely rich database of location-based everyday information over location and time. A similar thing is the mobile flickr "around me" service. If you use flickr on a mobile device it will give you the option to filter content based on your location; it is cool, but does not offer the control of Brightkite.
A refinement of this is the save-a-location tool, where you can mark a location as special. It is a place mark and can be used to tag a restaurant, for example. If you write a review or just leave a note about how the meal was, others can pick it up.
The iPhone app can be downloaded for free and is a must-have. It is simple but offers a lot of features. There seems to be an issue with the bottom line of links. On my phone the first instance shows two icons on top of each other, but only one can be accessed. The "request" button is somehow behind the "I am …" button after I clicked on the "more" tab.
So again if you are on Brightkite give me a shout!


Image by UrbanTick – Brightkite for iPhone application screenshots

The only problem with these tools, applications and software really is the real-space experience. I found myself in the last few days sunk into my iPhone and kind of absent from the environment around me. Although I was in a way deeply involved in the here and now, the past, and other users' experiences of the same place, I hardly sensed the place itself. My experience was not too different from looking at Google Street View from a remote location: a rather dull and emotionless consumption of something that is being sold to me as a real location while being a bunch of pixels.
It has a lot of qualities and interesting aspects that are not yet explored to the limit, but the downside is that mobile use takes you out of the real world into the pixel world and vice versa, while the benefit is not quite clear.

Read More

I recently put up a blog post about CitySensing, and ever since, the topic has been following me around town. Not only because of all the potential sensors I am carrying around with me, but probably also because I am more aware of the topic. I think the topic in general is closely related to the perception of space and in this sense to the mental map we all construct of the space we navigate. Our body senses are usually on high alert while walking down the road, and the environment is constantly assessed. From the uneven pavement we adjust our balance; with our ears we can hear the squirrel in the tree above us; we can smell the oil and dust from the building site on the road; we see the red van on the crossroad ahead. And that is only to list the "official" senses. There is probably also a sense for more embodied information, such as muscles providing a sense of force and speed, the breath and the heartbeat as indicators of effort, or the information about balance and the orientation of body parts. In short, there is a lot of information.
For now, I guess the technical sensing is simpler to describe, as the processing of the data into information is done by a chip, and we can tell the chip what the output should be, so it looks like a more straightforward exercise. The Economist has put together an extensive list of sensing projects and their potential.
Nevertheless, there are some really exciting technical CitySensing projects out there. For example, a cooperation of five universities (Imperial College, Cambridge, Leeds, Newcastle and Southampton) on the MESSAGE project has investigated the use of mobile sensors in urban environments and a variety of applications. A short clip shows a visualisation of the collected data. In an interview for "The Naked Scientists" on the BBC the researchers explain the potential of the project, and a podcast transcript can be found here.

CitySensing_cambridge01.2aDiTHVy2HC5.jpgCitySensing_cambridge02.q3xMQ9etghLH.jpg
Images from CamMobSens – Pollution monitored by pedestrians and cyclists with mobile devices sent directly to a website.

In Berlin, Germany, scientists are testing a network of sensors installed in buses. A BBC report can be found here. The sensors cover the usual air and road temperature as well as humidity, pollution indicators, some cameras and of course GPS, so traffic information can be calculated. The data is wirelessly transmitted to a processing centre. A project website can be found here.
As a more everyday, gadget-based project, the pathintelligence project is quite interesting. It is developed to locate the users of mobile phones and is aimed at retail and shopping centers. The system detects the unique signal of each phone and can locate it with about 1-2m accuracy. The shoppers are tracked with a number of static sensors, and the data is then used to derive information about the flows and preferences of visitors. A demo can be seen here. For shopping centers there is a lot of pressure and competition, so they are probably very willing customers for this kind of information. It is partly about offering a better service, but also about internal competition between the brands. For example, the tenancy mix but also the optimisation of rental costs are listed as benefits. Surprisingly this is only discussed in research circles, and shoppers are largely unaware of the monitoring process. The Times had an article on the topic, which was then picked up by the spy blog.

footpath.5aCfmgZ7ewbe.jpg
Image by pathintelligence – screenshot of the data visualisation software
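The locating principle behind such static-sensor tracking can be illustrated with the simplest case: three sensors that each measure their range to a handset. Real systems work with signal strength or timing from many more sensors; this sketch is my own illustration, not pathintelligence's method:

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Locate a signal source from three sensor positions (x, y) and the
    measured ranges to it. Subtracting the circle equations pairwise
    removes the quadratic terms and leaves two linear equations."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # zero if the sensors are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

With noisy range measurements a real deployment would solve the over-determined system from many sensors in a least-squares sense, which is how metre-level accuracy becomes plausible indoors.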

A pretty amazing CitySensing project is the sensity work by Stanza. The artist himself describes the project as “An artwork and visualization using data from around the environment. A wireless sensor network show emergent space as social sculpture”. The sensors used can monitor temperature, sounds, noise, light, vibration, humidity, and have a built in GPS unit.
These dynamic visualization scapes have been on show around the world, and usually a show leads to another recording, as the artist never travels without his equipment. So from London via Copenhagen to Paris and from Texas to São Paulo, the cities are sensed by stanza.

stanzaCopenhagen.t1txa8jKcm93.jpg
Image by stanza – sensing Copenhagen KLICK ON IMAGE FOR VISUALIZATION

stanzaSenor.cMhzOY2XktbG.jpg
Image by stanza – This mote is an MTS420 CC from Xbow without the GPS attached, running in low power mode.

A more web 2.0 project relying on crowd sourcing is the lhrNOISEmap project by Ian Tout. He is currently finishing his masters in Geographical Information Science (GISc) at Birkbeck College. He is mapping the aircraft noise produced by airplanes approaching or leaving London Heathrow Airport. For this he has built an online map based on Open Street Map and uses the web platform AudioBoo and its free iPhone application to record airplane noise in London. The short clips can then be mapped, as they are automatically geo-referenced. In a second step the data will be aggregated, and the noise levels should appear on the map as a layer.
So if you have an iPhone and are somewhere under the flight path of London Heathrow give it a try and participate in this mapping project. A simple step-by-step guide can be found here. You can also follow the project on twitter.
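The aggregation step described above could work roughly like this minimal sketch (my own illustration, not the project's code): snap each geo-referenced clip to a grid cell and average the readings per cell to form the map layer:

```python
from collections import defaultdict

def grid_cell(lat, lon, cell_deg=0.01):
    """Snap a coordinate to an integer grid cell
    (about 1 km across at London's latitude)."""
    return (round(lat / cell_deg), round(lon / cell_deg))

def aggregate_noise(recordings, cell_deg=0.01):
    """Average the decibel readings of (lat, lon, dB) clips per grid cell.
    A real noise map would average sound energy rather than dB values,
    but the arithmetic mean keeps the sketch simple."""
    sums = defaultdict(lambda: [0.0, 0])
    for lat, lon, db in recordings:
        acc = sums[grid_cell(lat, lon, cell_deg)]
        acc[0] += db
        acc[1] += 1
    return {cell: total / n for cell, (total, n) in sums.items()}
```

Each cell's average could then be rendered as a coloured overlay tile on the Open Street Map base layer.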

lhrNOISEmap.qjUPLcs5lkQz.jpg
Image by UrbanTick – screenshot of the lhrNOISEmap project

Read More

Along with GPS tracking technology, a whole bunch of other sensors are now available in a rather small format and at a cheap price, and they can easily be combined. So sensing the environment on a small scale is becoming possible, even popular.
A number of projects are under way. Here I put together some examples.
This sort of information is especially interesting for learning more about microclimates. Knowledge of fine-scale environmental conditions in cities is relatively low. With the now widely available technology it becomes possible to sense and record the environment as a pedestrian or a cyclist. This in turn could provide the data to generate a better picture of microclimates.
Mobile phones, as electronic devices that a large number of people carry around daily, could potentially become sensors and record and transmit environment-related information on a large scale.
Research that develops prototypes for this kind of data collection is undertaken at Carnegie Mellon's Human-Computer Interaction Institute by Eric Paulos. "How would it change your ideas about moving around in the world, if you could suddenly sense things you couldn't see?" he asks. In response to this work, some phone manufacturers have already expressed interest, as he reports in Seed magazine.

Probably a good element for DIY sensors is the Arduino open source platform, software and hardware. "Arduino is an open-source electronics prototyping platform based on flexible, easy-to-use hardware and software. It's intended for artists, designers, hobbyists, and anyone interested in creating interactive objects or environments." (From Arduino.cc)

An environmental sensing project runs in Paris. It is called "la montre verte", French for "green watch". It grew out of the idea of mobilising the 1000 fixed environmental sensors around Paris and generating more accurate real-time data. So far 30 prototypes of the green watch have been produced and are being tested at the moment in Paris.
The team has produced some beautiful visualisations from the collected data. They are built on a Google Map with a detailed interactive interface to select and replay the collected data.

Picture2.SqPNMTPgJFa3.jpg
Image from la montre verte

CamMobSens (Cambridge Mobile Urban Sensing) also works on a sensing project similar to the Paris project. So far they have collected data around Cambridge.

PastedGraphic.KROd0THYfvQe.jpg
Image from CamMobSens

A short clip of the data can be seen here; a paper has been published on the project.

Nokia is very active and always experimenting with new technologies. Of course they are also developing something related to the topic of extended environmental sensors. They have a dedicated project webpage at http://www.nokia.com/corporate-responsibility/environment. And of course there are also products, though not yet ready. Nokia's Eco Sensor concept is described on the Nokia page as: "The concept consists of two parts – a wearable sensor unit which can sense and analyze your environment, health, and local weather conditions, and a dedicated mobile phone. The sensor unit will be worn on a wrist or neck strap made from solar cells that provide power to the sensors. NFC (near field communication) technology will relay information by touch from the sensors to the phone or to other devices that support NFC technology."

PastedGraphic1.Clu8LTSysBkt.jpg
Image from nokia

The people at Pachube are working on integrating live environmental data into further digital development on the computer. They have developed a plug-in for SketchUp that feeds live sensor information into the SketchUp platform. Information on it is on their blog.

Pachube2SketchUp: plug in realtime sensor & environment data from Pachube on Vimeo.

Read More