A.R.T.I. is my capstone project. My team of seven (three developers and four designers) at the Rochester Institute of Technology created it over two quarters (20 weeks) in the fall of 2011. This “team project,” as it’s called, brought the designers together with the developers to create something awesome.
The first ten weeks were spent brainstorming, creating mind maps, building personas, and writing scenarios to funnel our ideas into a single, concise concept. We decided that art can lack a personal connection for the people who merely possess it. Our solution aims to produce a personal, expressive, and visually appealing representation of who you are, based on your body’s biometric data.
The next ten weeks were mostly spent developing our idea. Mike Higham, Andrew Kiproff, and I (the devs) wanted to challenge ourselves. We had all used MySQL to store data and PHP to access it, but we wanted to step outside our comfort zone. We researched, learned, and implemented Redis, Node.js, Socket.IO, the Kinect, and other cool new technologies to create A.R.T.I.
A.R.T.I. generates art from the user’s biometric data. We used Microsoft’s Kinect, the Kinect SDK, and C# to capture that data. Most (though not all) of it is averaged and then stored in a Redis database. Using Node.js, Express, and Jade, I created an online gallery where people can view the art after its creation. Express handled the routes and requests, then passed the appropriate data to Jade, which rendered the views from the templates I wrote. The final step was to have Processing use the user’s data to create the art.
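A minimal sketch of that pipeline on the Node.js side might look like the following. The field names (`heartRate`, `height`), the `user:<id>` key scheme, and the `/gallery/:id` route are illustrative assumptions for this example, not the actual A.R.T.I. schema:

```javascript
// Average a series of captured biometric samples into one record.
// Field names here are hypothetical; the real A.R.T.I. data may differ.
function averageSamples(samples) {
  const sums = {};
  for (const sample of samples) {
    for (const [field, value] of Object.entries(sample)) {
      sums[field] = (sums[field] || 0) + value;
    }
  }
  const averaged = {};
  for (const [field, total] of Object.entries(sums)) {
    averaged[field] = total / samples.length;
  }
  return averaged;
}

// Store the averaged record as a Redis hash, keyed per user.
// `client` would be a node_redis client; the key scheme is an assumption.
function storeUser(client, userId, averaged) {
  return client.hmset(`user:${userId}`, averaged);
}

// An Express route handing the stored data off to a Jade view (sketch):
// app.get('/gallery/:id', (req, res) => {
//   client.hgetall(`user:${req.params.id}`, (err, data) => {
//     res.render('artwork', { biometrics: data }); // views/artwork.jade
//   });
// });

const averaged = averageSamples([
  { heartRate: 70, height: 1.8 },
  { heartRate: 74, height: 1.8 },
]);
console.log(averaged.heartRate); // → 72
```

Storing the averaged values as a Redis hash keeps one round trip per user, and Express only has to hand the hash straight to the template as its view locals.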
I also made use of media queries for a responsive design. View A.R.T.I. on a tablet or smartphone for a customized experience!