Geek Gazette – Spring 2011

IEEE Student Chapter, IIT Roorkee


UNLOCKING DESIGN POTENTIAL
AUTHORIZED TRAINING CENTRE OF PTC USA

MECHANICAL COURSES: AUTOCAD, CATIA, PRO-E W5.0, ANSYS, CNC CODE GENERATION
ELECTRICAL COURSES: ELECTRICAL AUTOCAD, MATLAB, PLC & SCADA
ELECTRONICS COURSES: MATLAB, PCB DESIGN, EMBEDDED SYSTEM
CIVIL/ARCH. COURSES: CIVIL AUTOCAD, STAAD PRO, SAP, IMPRESSION
CS/IT COURSES: JAVA, ORACLE, .NET, C, C++, HTML, DHTML, ASP.NET, C#

HEAD OFFICE: 2ND FLOOR, ROORKEE TALKIES COMPLEX, ROORKEE
BRANCHES: DEHRADUN, MEERUT & JAGDHARI (HARYANA)
WEBSITE: www.cadarena.org
EMAIL: engineers@cadarena.org, cadarena09@yahoo.com
CONTACT NO.: 9219401273, 9997703829, 9690445125




TEAM AT GEEK GAZETTE – SPRING 2011

Branch Counsellor: Prof. S N Sinha
President: Arnav Thakur
Vice-President: Ankur Agrrawal
Finance Heads: Gunjan Sharma, Ghanshyam Verma
Design Head: Chandranshu Garg
Executive Editors: Gaurav Jain, Vikesh Khanna

News and Editorial: Mayank Garg Rose Ketty Tete Abhay Gupta Krati Verma Mukul Kumar Jain Rahul Singh Ranjan Kumar Shashank Shekhar Shristi Dohre Siddharth Bathla Tushar Gupta Rishabh Sharma Soumitr Pandey Mohit Garg Amish Bedi Vikram Singh Rathore Ankush Pumba

Finance: Rajat Gupta Abhanshu Gupta Nitin Nandwani Prakul Gupta Shagun Akarsh Somya Singhal Abhilash Jajoo Amay Sahasrabuddhe Apurv Srivastava Devang Roongta Jitendra Singh Navneet Goel Pallav Anand Pravesh Jain Priyanshi Goyal Rishabh Jain Saurabh Wagh Sameer Rastogi

Design: Shubham Jaiswal Peeyush Goyal Putta Ramakanth Tushar Mehndiratta Rahul Modi Shubham Mittal Lokesh Basu Sumit Kumar Chirag Kothari

(c) 2011 by IEEE Student Chapter, IIT Roorkee

All rights reserved. This magazine is meant for Free Distribution and no part thereof should be sold at any price without the prior permission of IEEE Student Chapter, IIT Roorkee.


Change is the only constant when it comes to life in the 21st century. New ideas and novel innovations form the driving forces of progress and the harbingers of gradual transformations that possess the potential to metamorphose the face of the Earth. One might wonder what changes could sweep through IIT Roorkee in the years to come, and how this glorious institution would appear a few decades hence. In this issue, we strive to envision the future through the instrument of imagination, incorporating the effects of contemporary as well as emerging technologies that constitute the vanguard of cutting-edge research in science and engineering – some of which have made their way into the pages of this gazette. And from what we can conjure, it looks quite extraordinary.

It is also our earnest endeavour to propel Geek Gazette from being just another technical magazine towards the higher motive of inciting a zeal for engineering and technology amongst our readers, and catering to their quest for knowledge by seeking answers to the queries they pose to us. We have therefore added a few more elements to our repertoire, including 'Rendezvous', our own 'Campus Trends' survey and 'Penrose Sudoku', as well as a few little nuggets of wisdom to enlighten the reader with facts hitherto obscured, or perhaps even unknown.

This issue also features articles and information encompassing diverse fields such as Green Buildings, Game Development, Quantum Physics and Network Security. The timeline presents the reader with an insight into the evolution of photography over the ages. The cover story focuses on the most sinister cyber attacks to date, attacks that have threatened to penetrate and demolish secure computer networks by means of malicious viruses and corrupt programs. For avid gaming enthusiasts, the genesis of our beloved Mario has been described in brief. A compendium of other interesting articles has been chosen to stimulate the reader's intellect.

On a slightly different note, the journey thus far has been a thrilling learning experience, and a memorable one at that. It is time to pass the baton of administration to the next echelon of Geeks; it is with them that the onus of negotiating the bumpy, winding road ahead lies. We thank our mentor Dr. S N Sinha for his continued patronage, and acknowledge the consistent support of our readers, whom we encourage to direct their valuable feedback, suggestions and comments to ieeegeekgazette@gmail.com. We leave with this famous adage from Buzz Lightyear, and with a vision for Geek Gazette to achieve unprecedented heights:

To Infinity and Beyond!



GREEN FOUNDATIONS

The Thyagaraj Complex

“Go Green” appears to be the safe refuge in these uncertain times of rising oil prices and an impending energy crisis. Everybody, right from NGOs and environmental organizations to nation-states, seems to have cozied up to the idea of “Green Technology” and is doing its bit by adopting eco-friendly techniques that minimally impact our surroundings. One of the various interesting concepts and technologies that have emerged from these efforts is that of Green Buildings. A Green Building can be described as a structure that aims to cut down on its carbon footprint. This is achieved by maximizing the utilization of available resources and energy, harnessing natural endowments such as sunlight and rainwater. Interestingly, this concept has been successfully implemented in the development of sports stadia across the world. The Thyagaraj Complex in Delhi, popularly known as the Commonwealth Games building and constructed prior to the commencement of the 19th Commonwealth Games, is India's first completely green sports complex. The design of the stadium has been hugely appreciated, largely due to the employment of novel technologies never used before in the nation's civil structures. Going by the reviews the Thyagaraj Complex has received, efforts to minimize the carbon footprint of this stadium appear to have been largely successful. Sprawling over an area of 16,000 square metres and valued at about INR 300 crores, this cube-shaped complex has been designed by an Australian firm, Peddle Thorp. The stadium has been built using fly ash bricks – a concept used for the first time in our country.

It also boasts water conservation structures, including a water recycling plant and a rainwater harvesting system that makes efficient use of the Delhi monsoon. The power supply of the stadium, too, is completely independent: power is generated by a 3.5 MW turbine run on piped natural gas. The turbine was fabricated in the United States and assembled in Switzerland at a cost of INR 19 crores. Solar energy has also been harnessed on a large scale in the stadium. Approximately 1 MW of electrical power is generated solely by solar panels built into the roof (107 m x 185 m) of the complex. The complex has specialized double-glazed glass frames that permit light transmission but inhibit the transmission of heat. This provides a natural ambience, thereby ensuring better playing conditions. The brick walls contain several cavities which act as thermal and acoustic barriers. As yet another first, the administration has received the coveted Gold rating from Leadership in Energy & Environmental Design (LEED) – an internationally recognized green building certification system – for the complex, and may well upgrade it to a Platinum rating very soon. Following the lead of the Thyagaraj Complex, several stadiums are being planned across the country along similar lines. Housing similar technologies in the design of the places where we live and play could be the next big thing in the times to come. We anticipate a future where a Green Building like the one described above is the order of the day rather than a marvel.
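A quick back-of-the-envelope check, using nothing but the figures quoted above (the arithmetic is ours, not the stadium's):

```latex
\frac{1\ \text{MW}}{107\ \text{m} \times 185\ \text{m}}
  = \frac{10^{6}\ \text{W}}{19\,795\ \text{m}^{2}}
  \approx 50\ \text{W/m}^{2}
```

Assuming typical panels of that era deliver on the order of 150 W per square metre at peak (an assumption on our part), about 50 W per square metre of roof corresponds to photovoltaic modules covering roughly a third of the roof area.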


RENDEZVOUS

DC++ : SDSLabs

It is a painful irony indeed that, despite being one of the premier technology institutes of the country, IIT Roorkee is made to starve for something as mundane as Internet connectivity. However, the launch of DC++ – a file sharing service – by the newly formed SDSLabs, Hobbies Club, promises to bring at least some measure of relief to the "eternally disconnected" junta of IITR. Geek Gazette recently caught up with Shobhit Singh (CSI, IV Year), Founder and President of SDSLabs, to talk about DC++ and its implications for the aam aadmi. Here are some excerpts from the interview.

GG: What is DC++? What does it have to offer to an average IITian (oxymoron?)?
SS: DC++ is a local peer-to-peer file sharing application. In layman's language, it allows anyone to search and download stuff shared by anyone else on the entire IITR intranet LAN. The data resides on one's own computer, not on any server. Being on the intranet, it is quite fast, and since everyone can share anything, it grows constantly.

GG: Where did the idea of having DC++ in IITR come from? Any particular source of inspiration that we should know of?
SS: I am from Kanpur, and on seeing people use DC++ at IIT Kanpur frequently, I felt the need for the same here too. Why should IITR lag behind in technology?

GG: Were there any hiccups in launching the service on campus? How did the administration react to the idea?
SS: Our network structure is quite different from other institutes as we have WiFi (other IITs have wired LAN). In addition, the network architecture here is different, implemented locally at the hostel level. We had a fair share of problems in its setup, but then we patiently solved them one by one to make it successful.


In general, DC++ by itself does not support a network like ours because of NAT (Network Address Translation). Due to NAT, users in one hostel cannot connect directly to users in other hostels. So we figured out an innovative workaround to create a sort of bridge between our network architecture and the network setup which DC++ needs, without sacrificing throughput. We built two pieces of software – one running on the Bhawan gateways and the other running on the end users' computers – to achieve this. [Ed.: a toy sketch of the relay idea follows this interview.] The administration was very supportive of the idea from the very beginning. They had been interested in implementing DC++ for two years, but because of the technical difficulties involved in its setup, it hadn't been established earlier by anyone else. We are thankful to Dr. N.K. Goel, Dr. Padam Kumar and Mr. Naveen Shukla for all the help and support they gave us in our endeavour.

GG: DC++ is currently accessible only in 4 Bhawans (Sarojini, Govind, Azad, Ravindra). How long before everyone comes under the umbrella?
SS: It will be coming very soon to all the Bhawans. We could launch it even right now, but the Information Superhighway Centre wants us to be sure of its load in these Bhawans before we move on to others. It is sure of its usage now, and soon we will put DC++ up in all the Bhawans, including all the CCs and the library. Even the Saharanpur campus will be connected.

GG: Lately, there have been some complaints about difficulty in installing DC++ on Windows. Any word of advice for such people?
SS: There were some problems initially, mainly because people are not used to it (the configuration is a bit long and most people tend to skip certain steps). Also, as one moves from the web to the software side, things get a little more complex, given the different operating systems and architectures. And to be honest, DC++ is still in the beta phase. Nevertheless, we have corrected most of the bugs reported by people. A word of advice to all – please follow the help file line by line. Do not miss anything. It is just a one-time process.



All those who faced problems earlier are requested to download the new service and also try the troubleshooting options displayed on the DC page. If you are still unlucky, contact us on the DC main chat, mail us or come to the lab. We will soon be sending our teams to the Bhawans on specific days to deal with the DC problems there.

GG: Some concerns have been raised about the DC++ service consuming whatever little bandwidth students get for Internet usage, thus effectively worsening the situation rather than improving it. How would you justify DC++ then?
SS: Just keeping yourself connected to DC++ does not consume any bandwidth. Transfers take place only while downloading, and even those are peer-to-peer, so they do not affect central bandwidth usage. We have done a lot of research and monitoring to make sure that it does not deteriorate normal Internet usage at all. After a lot of testing, it was found even by the ISC that DC++ does not affect network speed or bandwidth, because the local intranet is not as constrained as the Internet. Also, we asked the ISC to replace or upgrade the gateways wherever we found them not good enough to support DC++. This was the reason for initially leaving out the lower-region hostels (RJB, RKB etc.). Please be sure of this: in case DC++ starts affecting network usage, we will be looking into the matter even before you notice it.

GG: When can we expect a similar service for the Linux lovers? (The service is currently available only for the Windows platform.)
SS: The Linux service is almost ready and tested. We will be releasing it along with the launch of DC++ in the rest of the Bhawans. We did not release it earlier because we wanted to keep the number of users limited, plus it was still in the testing phase. But since most of the team members in the lab are Linux lovers, it is always a priority for us. Macs will be supported later, after Linux.

GG: Any other secret plans of SDSLabs that you may wish to leak to us?
SS: We normally encourage people here to work on applications that are not restricted to just web development. For example, we recently worked on providing Intranet TV, which could stream specific channels to any user on campus. Freedom to work on any idea has become a part of our working culture. That is why we organized the Syntax Error night, where we asked everyone to work on any idea for twelve hours straight, from 9 pm to 9 am. Some of those ideas have been taken up by us and we are seriously working on them. One of the main aims of SDSLabs is to take on-campus technology to the next level, and you will feel the change as we come out with more of our applications and services.

GG: That's great! Wish you loads of luck in all your endeavours. :)
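As flagged above, here is a deliberately tiny, hypothetical sketch of the 'bridge' idea in the abstract: a relay running on a machine that both sides of the NAT can reach accepts a connection from one side and pumps the raw byte stream to a host on the other side, and back. This is emphatically not SDSLabs' software – the addresses, port numbers and overall design below are assumptions made purely for illustration.

```python
# Hypothetical TCP relay, illustrating the bridging idea in the abstract.
# It listens on one side of a NAT and forwards traffic to a host on the other.
import socket
import threading

LISTEN_ADDR = ("0.0.0.0", 9000)    # hypothetical: where clients connect
UPSTREAM_ADDR = ("10.0.0.2", 411)  # hypothetical: the hub on the far side


def pump(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes from src to dst until either side closes."""
    try:
        while data := src.recv(4096):
            dst.sendall(data)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()


def handle(client: socket.socket) -> None:
    """Open a connection to the upstream host and relay in both directions."""
    upstream = socket.create_connection(UPSTREAM_ADDR)
    threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
    threading.Thread(target=pump, args=(upstream, client), daemon=True).start()


def main() -> None:
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(LISTEN_ADDR)
    server.listen()
    while True:
        conn, _ = server.accept()
        handle(conn)


if __name__ == "__main__":
    main()
```

A real deployment would need authentication, connection limits and far better error handling, but the core idea – terminate the connection on a machine both sides can reach and pump bytes in both directions – is the essence of any such bridge.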

RENDEZVOUS

Geek Gazette interviewed the man who led the crew – Guptanath Dash (B.Tech, 4th year, Mechanical Engineering).

GG: Congratulations to your entire team for the excellent feat at BAJA SAE INDIA. Tell us more about the event.
GD: BAJA is an inter-collegiate competition where students from all over the country participate in teams to craft a mini version of an off-road buggy. In terms of participation, this is India's largest student event of its kind. Moreover, victory at BAJA is accompanied by handsome prize money of INR 10 lakh.

Team Name: KNOX, IIT-R
Rank: 10th (BAJA SAE)
Engine: 350 cc, 14 hp
Transmission: Mahindra Champ
Top Speed: 60 km/hr (limited)
Weight: 300 kg
Braking System: 4 disc brakes

Team Knox, IIT-R added another feather to its cap with its excellent performance at BAJA SAE INDIA 2011. The competition, held at Pithampur, Indore this year, aims to provide SAE student members with a challenging project that involves not only designing, engineering and planning, but also manufacturing and marketing. IIT Roorkee stood 10th overall among all the participating teams. The merit of this achievement can be gauged from the fact that this was IITR's first attempt, and yet the best performance among all the IITs. At BAJA, the participating teams compete against each other to have their design accepted for manufacturing by a fictitious firm. Students must function as a team not only to design, build, test, promote and run a vehicle within the constraints of the rules, but also to generate financial support for their project and manage their educational priorities. Each team's goal is to design and build a prototype of a rugged, single-seat, off-road recreational four-wheel vehicle that can negotiate rough terrain in all types of weather without damage.


GG: When did your team begin preparations for the event?
GD: We started making the vehicle in 2009 and were finally able to participate in 2011. This is the first vehicle to have successfully represented our institute in such a competition.

GG: What was the composition of your team?
GD: Our team included 10 students from fourth year, 8 from third year and 7 from second year – a total of 25 members. In addition, we were expertly guided by our faculty advisors, Dr. R.P. Gakkhar and Dr. K. Murugesan. The vehicle was driven by Lalit Mohan (B.Tech, 4th year, Mechanical Engineering). Our endeavour met with success due to the time and effort put in by the Mechanical and Industrial Engineering Department. We received absolutely no help from outside.

GG: What was the gross expenditure on the project?
GD: Around INR 2,30,000. Our sponsors were Tata Genuine Parts and Mac Lubricants.

GG: Any problems that the team faced during the project?
GD: The biggest problem was the lack of resources, owing to Roorkee being a small town. We encountered additional shipping costs and time delays while ordering key components from major cities. There wasn't any major problem other than that. And had the 'A-arm' (a component of the vehicle suspension) of the buggy not broken off during the main event, we would definitely have won the 10 lakh cash prize.

GG: What are your plans for next year?
GD: We will recruit students from first year this time. It will be an excellent learning experience for them. Not only will it endow them with critical skills, but it will also familiarize them with the practical applications of their theoretical knowledge. By God's grace, we will definitely bring the trophy home next time.



NEWS THAT NEVER WAS!

Country Name: Facebook
Population: 600 million
President: Mark Zuckerberg

Facebook, in a historic press conference a few days back, announced the launch of a new country with over 600 million active citizens. This new country will be the third most populous, after China and India. Facebook founder Mark Zuckerberg announced in the conference that all existing members of the social networking mammoth will be offered citizenship to this new country, the name of which will be decided by the citizens themselves through polls on Facebook.

The constitution, said Facebook officials on condition of anonymity, is in the development stages. It will incorporate features of the American, British and Canadian constitutions, considering the fact that a big chunk of the initial citizens will be from these countries. Thereafter, the constitution will be presented to the citizens in the form of a fan page to invite their comments and likes. In an exclusive interview with Geek Gazette, Zuckerberg said that his initial five-year plan will include key agendas like promoting world peace (by banning applications like 'How devilish am I') and spreading love (by creating more applications on the lines of 'How romantic am I'). He also said that the economy will be a free market economy aimed at benefitting the 'common user' with limited 'cash' or 'chips' in Farmville or Poker. Users would be able to buy onions and petrol for their Farmville and Cityville accounts at discounted prices during crises such as the present one. He said he aims at "creating a community where people can peacefully farm on their Farmville plots while occasionally indulging in a game of poker in the other tab."

Facebook users across the globe have reacted enthusiastically to this development. Al-Qaeda outfits are eager to establish their training camps in the new country. The Indian, Chinese and Pakistani governments are excited about the possibilities of disputed lands for their perennial amusement. Obama has expressed dismay over the new photo viewer and has refused to comment until it is fixed. The American and European janta is happy to have received free Farmville and Poker chips. Manmohan Singh, as usual, has opted to remain mum on the issue. Justin Bieber's comments on these global developments were fortunately not sought, and Taylor Swift too was expectedly ignored. All in all, it appears to be a wonderful proposition. People are spending longer hours on Facebook, opening random profiles to get to know their countrymates, even waking up to the idea of democracy in remote parts of the world. Zuckerberg, encouraged by these proceedings, has given signs of diluting privacy policies to encourage transparent interaction. Stalking, anyone?

P.S.: We at Geek Gazette promote randomness, the proof of which is amply visible above. If you think you can be even more arbitrary than us, do write to us at ieeegeekgazette@gmail.com. Your aim should be to maximize the entropy of your thoughts, calculated over all the stochastic variables within their domain. In short, be arap – as random as possible.


THE PHILADELPHIA EXPERIMENT

Strife and war have always inspired mankind to come up with radical ideas and inventions. Camouflage, the ability to merge with one's surroundings for defense or ambush, has always been a priority research objective for the army. But one such project, undertaken by the US Navy in October 1943, aimed for something greater than that – INVISIBILITY, the ultimate form of camouflage.

The objective was to cloak a warship in ultra-high-strength alternating magnetic fields to bend light around it. Effectively, the light falling on the back of the warship was to be projected to the front, thereby rendering it invisible. Since this bending effect would apply to all forms of electromagnetic radiation, the ship would gain both optical and radar invisibility. This idea was the brainchild of none other than Albert Einstein, who believed it to be a natural consequence of the Unified Field Theory (UFT) that he was working on. The UFT aims to unify electromagnetism with gravity. According to it, electromagnetism can bend light in the same way as gravity. And since the electromagnetic force is some 10³⁶ times stronger than gravity (a rough order-of-magnitude check appears below), one should be able to achieve the bending without having to create something as humongous as a black hole. However, the theory was never completely verified. Yet, it is believed that bigwigs like John von Neumann, Albert Einstein and Nikola Tesla were a part of this project, sometimes also referred to as Project Rainbow.

The project began on October 28, 1943 at the Philadelphia Naval Shipyard, Pennsylvania. A dummy crew was put on board an old US Navy destroyer, USS Eldridge, with a series of instructions to carry out. The scientists and the top brass observed the entire experiment from an onshore observation deck. The ship's hull was wound to the maximum limit with wires that were to carry a very large alternating current, to cloak the ship in a high-frequency oscillating magnetic field. The generators on board were then fired to maximum power. As the alternating magnetic field strengthened, the observers saw the ship getting veiled by a translucent greenish haze, flickering for some time… and in a moment, the ship disappeared, leaving not a trace behind it. USS Eldridge had achieved the supreme form of camouflage. Unfortunately, it lasted for only about 10 seconds. The ship reappeared. And now it resembled something from a science fiction movie gone terribly wrong.
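The promised order-of-magnitude check: taking the conventional comparison between two protons (our choice of example), the ratio of the electrostatic to the gravitational force between them is independent of their separation and works out to roughly the quoted figure.

```latex
\frac{F_{\text{electric}}}{F_{\text{gravity}}}
  = \frac{k e^{2}/r^{2}}{G m_{p}^{2}/r^{2}}
  = \frac{k e^{2}}{G m_{p}^{2}}
  \approx \frac{(8.99 \times 10^{9})\,(1.60 \times 10^{-19})^{2}}
               {(6.67 \times 10^{-11})\,(1.67 \times 10^{-27})^{2}}
  \approx 1.2 \times 10^{36}
```

For an electron-proton or electron-electron pair the ratio is even larger (around 10³⁹ and 10⁴² respectively), so 10³⁶ is the most conservative of the usual comparisons.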



Spectators of this debacle aboard the SS Andrew Furuseth (a merchant ship said to have witnessed the experiment) claimed that they saw the Eldridge crew "mingled" with the ship's structure. Unconfirmed reports claim that crew members were found half buried in the deck, fused to the bulwark or impaled on the side rails. The supposed reason: when exposed to such a high energy density, the ship and its crew became one single mass, their molecules scrambled with each other, resulting in the terrible scene on deck. As might be expected, according to official reports no such experiment was ever conducted, and the Eldridge crew members went missing or died in action against the enemy during the war. Nonetheless, the Philadelphia Experiment captured the imagination of the general public and many references to it are found in popular culture. The experiment has been the subject of several television shows dealing with the paranormal and with conspiracy theories, including "The Unexplained", "History's Mysteries", "Vanishings!" and "Unsolved Mysteries". It was also the subject of a 1984 sci-fi movie of the same name. Numerous versions of it have since featured in films, books ("The Macros", "Green Fire") and video games ("Doctor Who", "Assassin's Creed"), fanning even more speculation and rumour. However, nobody knows the complete truth, and the actual facts about the experiment might remain shrouded in mystery forever.


To err is human, but to really foul things up requires a computer

COMPUTER BLUNDERS

Paul Ehrlich might have made the above remark a bit tongue-in-cheek, but the kind of blunders that computers and other high-end technology have gotten themselves into over the years are recurring testimony that whenever technology goes wrong, it often does so in a spectacular manner. Geek Gazette brings to you some of the most bizarre and freakish tech goof-ups, which not only highlight our over-dependence on technology but also the inherent unreliability of everything man-made.

1. Minor calculation mistake destroys Mars Orbiter

It was way back in 1999 that the US Mars mission controllers were hoping to unearth new secrets about the red planet through their Mars Climate Orbiter. However, as it turned out, a minor computational error related to the conversion of units of force (the US customary system uses the pound-force instead of the newton, the SI unit of force) was to spell doom for their plans. As the orbiter approached Mars, instead of entering the atmosphere on the high trajectory the calculations called for, it forayed in on a very low trajectory. When the boosters were fired to stabilize it, the orbiter was pushed further into the Martian atmosphere, where it ended up as a dead piece of metal. The total cost of this simple mathematical error was estimated to be around 327 million USD. A hefty price indeed for a missed conversion factor.
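To make the failure mode concrete, here is a minimal Python sketch. It is not NASA's software, and the thrust and burn-time figures are invented purely for illustration; the only hard number in it is the conversion factor of 4.44822 newtons per pound-force.

```python
# Illustrative only: how a forgotten pound-force -> newton conversion
# silently corrupts a value that downstream code treats as SI.

LBF_TO_NEWTON = 4.44822  # 1 pound-force expressed in newtons


def impulse_si(thrust_lbf: float, burn_time_s: float) -> float:
    """Impulse in newton-seconds, with the unit conversion applied."""
    return thrust_lbf * LBF_TO_NEWTON * burn_time_s


def impulse_buggy(thrust_lbf: float, burn_time_s: float) -> float:
    """Same arithmetic with the conversion forgotten: the result is in
    lbf*s, but the caller will happily interpret it as N*s."""
    return thrust_lbf * burn_time_s


# Hypothetical thruster firing, purely for illustration
thrust_lbf, burn_s = 0.9, 120.0
correct = impulse_si(thrust_lbf, burn_s)
buggy = impulse_buggy(thrust_lbf, burn_s)
print(f"correct: {correct:.1f} N*s, buggy: {buggy:.1f} N*s, "
      f"off by a factor of {correct / buggy:.3f}")
```

The buggy figure is about 4.45 times too small once it is read as newton-seconds; an error of exactly that flavour, accumulated over many trajectory-correction burns, is what sent the orbiter in far too low.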

2. US hands over "Bogus Code" to the Soviet Union

It was at the height of the Cold War that the Soviet Union was searching for better gas-pipeline technology to transport gas to Europe. However, instead of doing their own R&D, the Soviets decided to take a shortcut: the plan was to quietly steal the technology being used by the US and install it in their own systems – a plan that was to go awfully wrong. As it happened, US intelligence came to know about the Soviet Union's plans and decided to set up a counter-intelligence sting. US authorities planted a modified version of the software at a Canadian company which the KGB (the Soviet intelligence agency) was targeting. The changes in the software were so expertly hidden that it passed through all inspections and tests by the Soviets. The pipeline software ultimately found its place in a system responsible for transporting more than 40 billion cubic metres of gas. As expected (by the US), the software malfunctioned at the designated time, causing the whole system to go haywire. Valves and turbines turned on and off at random, and pipes ruptured, resulting in an explosion visible even from space.

3. Sun sparks World War III. Well, almost!

Our history books might have been a lot different had a certain Lt. Colonel not acted on his instincts. This goes back to 1983 (again a Cold War setting), when the USSR had just installed an early-warning system to alert it of any incoming ICBMs (Inter-Continental Ballistic Missiles). However, on the fateful day of 26th September 1983, nature conspired to create false alarms in the early-warning system.



The Sun, the infrared sensors on board the Soviet monitoring satellite, and the missile fields of the US were somehow so perfectly aligned that day that sunlight reflected off high-altitude clouds produced an intense glare. The reflected sunlight was picked up by the Soviet infrared sensors, and the Soviet command centre saw an ominous view on its screens: five nuclear missiles headed their way. It was only the gut feeling of Lt. Colonel Stanislav Petrov that prevented him from passing this information up to the higher authorities. The Lt. Colonel later recounted, "When people start a war, they don't start it with only five missiles," explaining why he didn't report the matter to his bosses. Now that must have taken guts for sure.

4. Doomsday for Power Programmers in the US

Computers are an intrinsic part of all critical industrial systems these days, and when these critical systems go wrong, the consequences get multiplied manifold. This is exactly what happened on the night of 14th August 2003, when 50 million people in the US and Canada bore the brunt of a so-called race-condition bug. The bug resulted in a total blackout in 8 American states and parts of Canada, forcing people to spend their nights in complete darkness. A race condition occurs when two threads (or processes) access a shared resource concurrently and the result depends on the order in which they happen to run. In this case, feeds from many networks triggered the bug, which first brought down the alert system. The network administrators thus remained unaware of what was going on under the hood until the consequences became visible to the eye. The minor problems that should have been solved on receiving warnings from the alert system remained unsolved, creating load on the primary server. As a result, the primary server crashed within 30 minutes, followed by the back-up server. Soon the power lines and circuit breakers started tripping out, and the whole grid was brought to its knees within a few hours. The incident left around 256 power plants offline, making it the worst power failure in the history of the United States.
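For the programmers in the audience, here is a minimal, self-contained Python sketch of a race condition. It has nothing to do with the actual grid-monitoring software; it simply shows how an unsynchronized read-modify-write on shared data loses updates.

```python
# Two threads increment a shared counter without any locking.
# Each increment is a read-modify-write, so updates can be lost
# when the threads interleave.
import sys
import threading

sys.setswitchinterval(1e-6)  # force very frequent thread switches

counter = 0  # the shared resource


def worker(iterations: int) -> None:
    global counter
    for _ in range(iterations):
        value = counter      # read ...
        counter = value + 1  # ... then write; another thread may have
                             # bumped `counter` in between, and its
                             # update is silently overwritten here


threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("expected 200000, got", counter)  # almost always comes up short
```

Guarding the read-modify-write with a threading.Lock() makes the final count a reliable 200000. The 2003 bug was of the same family, only buried inside far more complex alarm-processing code.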



'Change is the only constant' – the saying is perhaps best typified by the ever-evolving IT world. The pace at which trends and user preferences change here is almost unmatched by any other industry segment. No wonder, then, that names hitherto unknown just a few years ago are now the stalwarts of the Web. It is this evolution that brought out Google and Facebook, and it is this evolution that is making yet another name a part of our popular culture. Android – a product of Google – is fast emerging as the giant of mobile and tablet platforms, having already announced its arrival by acquiring the top spot in the mobile market. That it dislodged Nokia's Symbian OS – the market leader for the last ten years – in the process is no small feat.

It all started when Google acquired a little-known company, Android Inc. (co-founded by Andy Rubin, now director of mobile platforms at Google), in July 2005. Google had been mulling over the idea of entering the mobile market for quite some time, and when Rubin mooted the idea of an open-source mobile platform called Android – easy for a coder to write for and easy for a handset maker to install – Google realized that this could be the perfect jumpstart for them. In a smart tactical move that was to yield rich dividends, instead of just supporting the product, Google bought the whole company.

The first Android 1.0 (codename Apple Pie) phone, the HTC G1, was released in 2008. However, it lacked real smartphone features and wasn't exactly able to create a stir. Since then, Android has evolved a lot and has come a long way to its present-day enviable status as the only 'viable alternative' to Apple's benchmark iOS. The launch of the first Android has been followed by a number of updates and releases, each one adding exciting new features to Android's repertoire. An interesting bit of trivia: all of these releases have been named after delicious, mouth-watering desserts.

The next version after Apple Pie – Android 1.1, codenamed Banana Bread – was deployed to Android-powered handsets from February 2009. This was the first software update for Android. New features included support for marquee in layouts and saving attachments from MMS. Version 1.1 also fixed the 'force close' issues that many people were having.


With the arrival of Android 1.5 in May 2009, "Cupcake" became the new buzzword amongst the general public. Cupcake was the first major update, and it gave the Android OS more finesse in many areas. The copy-and-paste option within the browser and video recording were some of the new features introduced in the update. It also allowed users to upload their videos directly to YouTube. However, there were still some problems with the on-screen keyboard as well as the widgets, which affected the user experience.

Android 1.6, also known as Donut – a small update released in October 2009 – was next in the chain of Android avatars. It came with a revamped search box, camera and gallery application, and a completely refreshed Android Market. This time, 'search' was intended not only for searching the web; it allowed a user to search many other places within the device, like bookmarks, applications and more. In addition, it introduced a text-to-speech library, built-in Google Translate and a handwriting recognition feature.

Android 2.0, Eclair, was a great step in the evolution of Android. It debuted with the Motorola Droid in November 2009. This was the second major iteration of the platform, and it was with Android 2.0 that Android emerged as a potent competitor in the mobile segment. Applications were now able to synchronize with contacts from Facebook, Twitter and other social networking sites. This update also included new camera features like digital zoom, color effects and macro focus. The virtual on-screen keyboard was also improved. However, the single most important feature that caught everybody's attention and tilted the scales in favour of Eclair was perhaps the Google Maps turn-by-turn GPS navigation service, offered for free.

Android 2.1 marked the second stage of Eclair's evolution with the release of the Nexus One. 3D effects and animated wallpapers were also introduced in this version. The Nexus One was also the first Android handset that gave users the option to translate speech into text in any text field. Apart from these updates, the photo gallery was given a major 3D revamp; Cooliris, a US corporation, helped Google develop this gallery. This update also added multi-touch functionality, which worked butter-smooth on Android handsets.


Android 2.2, Froyo, came with a JIT (Just-In-Time) compiler and the JavaScript V8 engine, which made it more than two times faster than previous versions. Other great features of this update were Macromedia/Adobe Flash 10.1 support and a visible decrease in the time taken to launch applications. The Dalvik performance boost increased speed by two to five times over Android 2.1. Global Address List look-up was made available in the Email application, enabling users to autocomplete recipient names from the directory. Enabling the LED flash for the camcorder made it possible to shoot high-quality videos even in low light.

The next update for Android came in the form of Android 2.3, Gingerbread, released in December 2010. Gingerbread was a minor release that was later followed by the bigger 3.0 Honeycomb release. This update included an all-new on-screen keyboard to make typing faster and more intuitive. This version also came up with a new and better way to select text and copy-paste it. Android 2.3 provided better insight into the applications running in the background, the memory and CPU cycles being used, and even allowed users to kill misbehaving apps.

The latest release of Android is Honeycomb (Android 3.0), released on 26 January 2011. Keeping with the legacy, Honeycomb has introduced many new and exciting features for users as well as for developers. Honeycomb has basically been designed for devices with larger screen sizes, particularly tablets. It has an optimized tablet user interface which allows users to place frequently used applications directly on the home screen. The Android soft keyboard has also been redesigned to make entering text fast and accurate: the keys have been reshaped and new keys, like the Tab key, have been added, to provide richer and more efficient text input. Apart from this, Android 3.0 has also introduced new connectivity features and built-in support for the Media/Picture Transfer Protocols (MTP/PTP), letting users instantly sync media files with a USB-connected camera or desktop computer without having to mount a USB mass-storage device. With all these added capabilities, Android 3.0 appears to be the most complete and all-round smartphone OS in the market right now.



What Next? Android's next platform update, dubbed 'Ice Cream Sandwich' (not confirmed yet), is just around the corner. Going by the codename, this Android release should hit the market sometime this summer, possibly in mid-2011. With the release of Android 3.0, many were assuming Ice Cream Sandwich would be given version number 3.1, 3.5 or 4.0, but it looks like Google wants it to be named Android 2.4. Speculation is rife that the next version of Android will be an updated version of Android 2.3, Gingerbread, and as a result will be more focused on the phone experience rather than on tablets. However, these are still rumours, and we need to wait and watch how Google goes about serving 'ice cream' to its users.

P.S: By the way, did you notice the alphabetical naming of the successive releases of Android?

Apple Pie – 1.0
Banana Bread – 1.1
Cupcake – 1.5
Donut – 1.6
Eclair – 2.0, 2.1
Froyo – 2.2
Gingerbread – 2.3
Honeycomb – 3.0
Ice Cream Sandwich – ?

PENROSE SUDOKU

In this variant, called Penrose Sudoku and named 'Torch' by its creator, Huang, only the numbers 1 through 8 are used, and the rows and columns have been distorted by the irregular tile pattern. A row is a series of quadrilaterals that share parallel edges. For example, the highlighted set of eight cells (covering the givens 7, 5, 4, 3 and 6) are all in the same row. There are exactly 10 rows, each with eight cells, beginning and ending at opposite edges of the decagon, and each cell is in exactly two rows.

Solution at Page No. 34



Cover Story: The Times They Are a-Changin’


Malice and genius, when combined, often make for a heady cocktail – even more so in this Information Age, where evil intent in conjunction with expert knowledge can spell doom for the targeted victims. In the past few years, with computers becoming the heads and brains of almost everything around us, cyber attacks and cyber warfare have acquired a whole new dimension. Consequently, malicious code today not only threatens virtual assets in a virtual world; it is breaking new ground by targeting real-world infrastructure (remember Die Hard 4?). In particular, some of the cyber attacks staged in the past few months have proven to be game-changers, affecting the very foundations of network security. In this cover story, Geek Gazette explores this rapidly changing threat scenario through an exposition of the two biggest path-breaking events on the network-security landscape, each of which has left an indelible mark in its wake.

Stuxnet:

Breaking barriers, stereotypes and Nuclear Power Plants

One of the most spectacular and efficient cyber attacks of all time, Stuxnet marks the arrival of a new generation of malicious code in the cyber world. Code causing tangible damage in the real world was, until now, the domain of sci-fi movies. That is no longer true. Stuxnet is, quite simply, the most complex and potent piece of malware seen to date. With ambitious targets like the industrial control systems (ICS) controlling gas pipelines and nuclear power plants, Stuxnet is necessarily complex. However, its basic goal is rather simple to state: Stuxnet primarily aims to reprogram the Programmable Logic Controllers (PLCs) used to operate a variety of industrial control systems (such as in nuclear power plants and space missions) and to hide these changes from the operator, sabotaging the ICS in the process. To this effect, Stuxnet employs a variety of strategies, many of them a first in the history of malicious code. For the record, Stuxnet is not the first computer worm to have targeted industrial control systems. What makes it stand out is the sheer breadth and depth of tools used to stage the attack. Stuxnet makes use of a plethora of components such as zero-day exploits (vulnerabilities unknown even to the software vendor), a Windows rootkit, the first-ever PLC rootkit, antivirus evasion techniques, network infection routines, peer-to-peer updates, and a command-and-control interface to compromise the ICS under attack. Also, in a remarkable break from indiscriminately proliferating viruses and worms, Stuxnet uses highly target-specific fingerprinting to identify victim industrial control systems. In fact, Stuxnet looks specifically for computers running the Siemens SIMATIC Step 7 industrial control software that is widely used to program PLCs in the industry.


As such high-value, critical computers aren't normally connected to the Internet (for obvious security reasons), Stuxnet exploits a Microsoft vulnerability allowing auto-execution to replicate itself through USB flash drives. Once Stuxnet gets onto a host computer, it starts propagating on the local network by exploiting another zero-day vulnerability, this one in the Microsoft Printer Spooler service (both of these vulnerabilities have since been patched). To limit collateral damage, Stuxnet spreads to a maximum of only three other computers from a given host. This also helps Stuxnet spread stealthily across the network while keeping a low profile. However, as one might expect, infecting the critical hosts is just the beginning of the job for Stuxnet. After 'planting' itself across the local network surreptitiously, Stuxnet uses a Windows rootkit to hide its malicious binaries and exploits two more zero-day vulnerabilities (escalation-of-privilege flaws yet to be patched by Microsoft at the time of writing!) to gain elevated privileges on the infected computer. Now, if this infected host is found to be running the specific Step 7 software, Stuxnet secretly 'hooks' onto it using another zero-day exploit – this time in the Siemens Step 7 software – in such a way that Stuxnet executes every time the Step 7 software is loaded. By 'hooking' onto the industrial control software, Stuxnet is able to access any PLC attached to the infected computer via a data cable and reprogram it to malfunction the way it desires. Meanwhile, Stuxnet also installs another rootkit on the PLC itself (the first-ever use of a PLC rootkit) and sends 'everything's OK' signals back, thereby hiding its presence from the human operators of the critical systems. To top it all, Stuxnet also tries to update itself from two specific websites (since brought down as part of a global effort to counter Stuxnet) set up for the purpose, through any infected host connected to the Internet.



These updates are then propagated to other Stuxnet copies through a peer-to-peer mechanism within the LAN, helping it evolve against any counter-measures deployed to neutralise it. The end result is a critical industrial control system compromised without its operators having a clue until very late. As per statistics collected by Symantec Inc., most of the computers infected by Stuxnet (almost 60%) were located in Iran. Although not supported by any direct evidence, many experts opine that Stuxnet is the creation of a nation-state (most probably Israel or the US, or both), considering the level of sophistication and resources needed to create and test such a worm. Stuxnet is believed to have set back Iranian nuclear ambitions by at least two years – something that even a military strike might not have achieved – fuelling speculation that this might just trigger an entirely new kind of arms race: an arms race for 21st-century cyber weapons.

Stuxnet, not surprisingly, is being touted as a paradigm-changing event in the world of cyber security. It is ingenious, complex and refuses to fit any stereotype. In fact, Stuxnet might just be the beginning of a whole new generation of malware – malware that attacks real-world assets. And although the level of sophistication achieved by Stuxnet itself serves as a guarantee against the mass incidence of such attacks anytime soon, Stuxnet has definitely made everyone sit up and take notice. Computer worms sabotaging real, tangible things are no longer just a theoretical fantasy. The threat is a reality today and a stark reminder that we can never underestimate those on the 'Dark Side of the Moon'.

Operation Aurora: Fortune 500 under Fire

The year 2009 marked the advent of highly sophisticated cybercrime in the commercial sector – a class of attack that until then had been aimed mainly at defence secrets. The term "Operation Aurora" refers to the multiple corporate infiltrations that occurred in the second half of 2009. An inspection of the malware on the infected machines revealed that the word "Aurora" was part of a file path on the attacker's machine, hence the fancy name. Operation Aurora, basically a 'hack attack', combined an all-new level of encryption, stealth programming and a zero-day exploit in Microsoft Internet Explorer. It started with a predefined target user receiving a link in an e-mail from a trusted source. Clicking the link brought the user to a website hosted in Taiwan (or Illinois or Texas) which contained a malicious JavaScript payload. The JavaScript payload contained the zero-day IE exploit, which downloaded a binary disguised as an image file hosted on the Taiwanese servers and executed the malicious payload. This payload set up a back-door SSL connection to the command-and-control servers located in Taiwan, which eventually compromised system security and gave the attackers complete access to the internal systems. The main targets of Operation Aurora were intellectual property and source-code configuration management (SCM) servers accessible from the compromised system.

Access to the SCM servers was particularly dangerous, as these are used by all major companies these days to maintain and update the source code of their proprietary software. In effect, access to SCM servers meant that the attackers could make hidden changes in the source code which would result in widely distributed malware on public release of the software. The compromised system also acted as a beachhead for further penetration into the network, and vulnerabilities in Adobe's Reader and Acrobat software (along with several others) were used to compromise other computers. In effect, the attack provided a perfect platform to gain access to sensitive resources of reputed organizations and to use them for large-scale distribution of malicious code. Operation Aurora was a massive, coordinated cyber attack and is an example of a highly specialized class of attacks known as 'Advanced Persistent Threats' (APTs). APTs generally target nation-states and are often used for intelligence gathering. However, Aurora was the first instance of such an intricately planned attack in the commercial sector. Several Fortune 500 companies (Google, Adobe, Yahoo, Symantec) were attacked, and several more have been identified as potential victims of similar attacks in the future.



Most of the people associated with the task force set up to investigate Operation Aurora have pointed their fingers at China as the perpetrator of the crime. There have been suggestions that the Chinese government not only funded but also arranged personnel for the entire digital heist. However, as is generally the case with cyber attacks, nothing can be conclusively claimed about the origins of Operation Aurora. Like Stuxnet, this aspect of Operation Aurora remains shrouded in mystery, fuelling speculation and rumour.

What we do know, however, is that Aurora has brought the scourge of advanced security threats to the doorstep of the corporate world. The level of sophistication exhibited in these attacks has demonstrated the capabilities of the malevolent coder, and a reprise of Operation Aurora should not be considered an unlikely event. The writing on the wall is crystal clear: corporations can no longer be blissfully ignorant of the looming threats of living and operating in the digital world.

'First Blood' has already been drawn.

Security Scenario @ IITR: Now talking about things closer to home, Geek Gazette asked a few 'wise' men on campus how they felt about the network security infrastructure at IIT Roorkee. To nobody's surprise, a grim and bleak assessment was what everyone came up with.

Divye Kapoor (5th Year, CSI) says -“It is unfortunate, but true, that the Indian concept of cyber security is still in the dark ages when compared to the rest of the world. IITR like most other Indian organizations has miles to go to ensure security - whether it be physical or electronic. The email system is an especially weak area in the institute's network. However, to its credit, IITR has tried to improve its network security with the introduction of secure WiFi access points. I am hoping that over time more innovations and best practices in security will be incorporated into the IITR network.”

Prateek Gianchandani (B.Tech, 4th Year, Electrical Engineering) paints an even more disappointing picture of the IIT Roorkee network when he mentions various loopholes and vulnerabilities of the network. “The passwords sent over Channel I are not encrypted and can be easily compromised. WPA encryption employed recently to check username/password combinations with MAC address is not secure enough, because contrary to popular belief, MAC addresses can also be spoofed. The UG office, which holds sensitive personal information of students like contact information, address, grades etc., runs Windows Server 2003 on its systems. Similarly, DNS servers in most of the hostels run old Windows Boxes which are vulnerable to many attacks. Several scripts are available on the Internet to penetrate such systems. In my opinion, to improve the security of the network here we must use fully patched Linux servers, provide physical security to access points and make the general public aware of the consequences of a security breach. That is all we can do as far as wireless security is concerned because we are constrained by the inherently flawed wireless protocols.”

Vikesh Khanna (B.Tech, 4th Year, CSE), however begs to differ (literally!) when he observes - “IIT Roorkee's network is highly secure. Because there exists no network in the first place!” Well, we guess, many won't disagree with that either.



The placement season is finally over at the IITR campus, but many unanswered questions, myths and doubts regarding placements remain among freshers and the anxious third-year undergraduates staring at the next placement season. Geek Gazette took the initiative of finding answers by interviewing professors and successful fourth-years (read: those who got placed in the first week itself), among others, to understand the actual placement scenario on our campus. Read on to know what all the hullabaloo is about, where you could actually get placed during your final year and, most importantly, how.

To begin with, the placement team (noticed the placement office opposite the convocation hall?), comprising 3rd and 4th years, invites companies from various walks of the corporate world, which start visiting our campus at the fall of November or the onset of December. This year's day '0' was the 30th of November, 2010. Companies offering hefty pay packages are the first to set foot on campus, looking for the 'cream' at the top. On the basis of annual CTC (Cost to Company), these companies are classified into categories A, B and C – 'A' class offering compensation exceeding 5 lacs p.a., 'B' class offering 3.5-5 lacs p.a., and the dreaded 'C' class offering less than 3.5 lacs p.a. The general observation over the past years is that the so-called 'phodu junta' gets placed during the first two weeks in the class 'A' or 'dream job' category; and by 'phodu', we mean a decent CGPA and an impressive overall profile. Anything above 7.5 is treated as equal by most companies (except perhaps some R&D, IT, finance and consulting giants that demand a CGPA in excess of 8). If you are below the 7.5 mark, though, you shouldn't worry yourself sick, as companies generally drop this CGPA barrier after the first week.

Broadly, companies come with 5 different job profiles, namely IT, Field/Plant, Consultancy, R&D, and Banking/Finance. Mostly, only the R&D companies (like IBM IRL, ISRO, etc.) are branch-specific and demand subject knowledge; other profiles such as analytics (ZS Associates, Capital One, Deloitte) or banking (Deutsche Bank, Barclays) do not seek a thorough knowledge of your field of engineering (although it certainly helps if you have it). Interviews for such companies are generally based on HR questions, general awareness and, most importantly, quantitative aptitude, data interpretation and situational decision-making.



In general, the companies offering the highest pay packages are the ones offering field jobs, like oil giant Schlumberger or mining heavyweight Rio Tinto. This year, packages worth 18-22 lacs p.a. and 26 lacs p.a. were offered by these companies, respectively. Their lucrative pay scales attract almost everyone aspiring for a decent job (mind it, this time at least 500 students appeared for Schlum's resume-based shortlisting process). Active involvement in extracurricular activities, along with good technical knowledge and leadership skills, is given preference. The highest average compensation is generally bagged by the CSE (or CSI) students, with the EC and Electrical departments following close behind. 'Google' and 'Facebook' are names that incite awe in the institute's most talented programmers and budding computer engineers. 'Texas Instruments', 'GE' and 'Sun' are the most fitting platforms for Electrical engineers who hold their branch dear. Departments such as Paper and Pulp Technology, Metallurgical and Materials Engineering, Chemical Engineering and Biotechnology do not fare as well as their counterparts, as their core companies usually do not offer high starting salaries. But that should not dampen your spirits if you are a student of these disciplines, since more often than not, it is students from Mechanical, Chemical and Metallurgical Engineering who make their way into Schlumberger and Rio Tinto. Besides, numerous students make it to the plethora of IT and financial services firms that visit the campus. If you possess a sound profile, nothing can hold you back from securing a decent professional career. To summarize, the placement scenario this year has improved compared to last year, and more companies offering better packages have arrived on the scene. Ending on a positive note: for those of you with low CGPAs (less than 7), it's not the end of the world, as most of the companies coming after week 1 might well turn out to be your prospective employers.

A glimpse at this year's placement scenario (as per figures quoted by the Placement Committee):

DISTRIBUTION OF COMPANY PROFILES


GENERAL TIPS

· Try to keep your CGPA above 7.5, but if it is lower, you should be able to justify it with valid reasons – such as deteriorating interest in technical subjects, involvement in extra-curriculars, etc.

· Summer training and projects carry considerable weight for all company profiles. An internship in the finance sector/industry may help one secure a job in the consultancy/banking sector. To land a core/R&D company, one should lay greater emphasis on academic projects.

· Sort out your preferences beforehand; this will help you go for internships in the sector you are interested in, which will finally benefit you at the time of placements.

· Start preparing a few months in advance and make sure you solve past years' questions (technical and aptitude). This will bolster your chances of making it through the written test.

· Also prepare well on your resume, especially projects and internships. You should be able to defend your credentials. It is advisable not to mention things you don't have sufficient knowledge of.

· Attend the pre-placement talks of the companies you are interested in, and jot down important points – it helps during HR interviews, because then you know what the company wants to hear from you.



CAMPUS TRENDS

Google products can be defined in three fundamental terms – Simple, Fast, and Free. In today's changing Information Technology scenario, Google Inc. seems to have a technology for everything. It has set new benchmarks in internet services, and a number of competitors have followed in its wake. Some of these newcomers have consciously incorporated novel tactics and innovations into their services to keep up with the internet giant, while a few others have simply ignored innovation, perhaps under fiscal pressure. We present to you the results of a survey conducted by Geek Gazette, to ascertain whether people still have a soft corner for Google, or whether ingenious newcomers are finally catching up.

“ARE GOOGLE PRODUCTS STILL POPULAR?”

Facebook, the strongest rival to Google in recent years, introduced Facebook Messages at the start of 2011, but it seems the nascent FB mail has failed to impress. According to the survey, about 37% of IITR's webmail users reckon that the new FB mail stands clearly defeated against the already established Gmail – which is quite understandable, as the former has no IMAP support for accessing your mail from other clients, the concepts of 'Drafts' and 'Folders' seem to have been forgotten, and the beloved 'CC' and 'BCC' fields are conspicuously missing. Another oddity is that life-time conversation threads only really work for one-to-one relationships; group messages appear to be an afterthought.


The Android OS has often revelled in the limelight. But has it managed to justify its much-publicized capabilities? The answer is perhaps yes. About 48% of voters think that Android is superior to Apple's iOS 4 and Windows Phone 7 in most aspects of mobile technology. Android has consistently adapted to its users' changing needs. Not only is it open, but it also offers iPhone-like menus and apps, Windows Mobile-esque icons, and Palm Pre-analogous multitasking. On the other hand, 20% of the sample population feels that it's “just another flop product from Google”, missing out on functionalities like HID profiles, stereo Bluetooth support, native Wi-Fi support and the like.

When it comes to Internet browsers, Firefox is still the most trusted. 40% of users rated Chrome as awesome and a surprising 58% rated it as a good browser. Safari and Opera figure as alternatives. Internet Explorer lags far behind (you probably know that already).

The Google factor! To add the final touch to the Google report card, we asked users for their preferred products in specific service categories. The response turned out to be in line with expectations. Gmail was the first choice of 90% of users, asserting its monopoly in email services. GTalk, which allows for better flexibility than other IM programs, and Picasa, with features like its API, tagging and geotagging, both received the majority share of user preferences. Products like Chrome, Android, Docs and Google Groups received a healthy average response. On the darker front, Google Buzz and, curiously, Orkut were perceived as the company's biggest flops.

Success mantras, anybody? The aforementioned statistics are strictly based on the 430 responses to the Geek Gazette questionnaire posted on Channel I. If you feel you can provide us with more plausible insights, or better still, feedback, get in touch with our student tech-experts at ieeegeekgazette@gmail.com.



Everyone knows and loves the fireball-spewing plumber who jumps Koopa Troopas and Goombas to death. However, did you know that Mario could also have been a gun-hauling, cloud-jumping turtle shooter? It took a lot of conceptualization to bring the game to where it is today.

Believe it or not, Mario was written in straight-up assembly language. The creation of the game was the result of a horde of different factors. It was the vision of Shigeru Miyamoto, the designer of Super Mario Bros, to create a game which would become the quintessential 8-bit cartridge-based video game. He wanted to build upon a tradition of “athletic games”, where you controlled a character in a platform environment with a lot of jumping involved.

Mario wasn't always meant to eat magical mushrooms and grow up either. The game was originally built around the small-sized Mario, but he was made larger in the final version. This gave the development team the idea of introducing an item to let Mario grow bigger while progressing through a level – hence the concept of Magic Mushrooms. Since the game is set in a magical kingdom, and there are always tales of people wandering off into the jungle and eating magic mushrooms, it seemed like a good idea to let Mario in on the mushrooms as well. This, in turn, led to the famous title of the game world, “The Mushroom Kingdom”.

The original enemy-killing mechanism went along the lines of hitting the enemy from below to knock them out, while direct contact with an enemy would kill Mario. This turned out to be a tad boring, and finally the notion of jumping on the turtles to knock them out and then jumping again for the coup de grâce was settled on, allowing players to jump on turtles to their heart's content. The developmental phase also contained levels involving jumping on clouds and shooting enemies. However, as development progressed, bullets were removed, and the cloudy bonus stages are all that remain of the idea.

As much as Mario's A for jump and B for fireball-spewing seemed like second nature to gamers, it was much harder to think about for the developers. For the majority of the development phase, the controls were set as A for shooting bullets, B for dashing and up on the D-pad for jumping. However, the developers thought that a dashing, shooting Mario would have a pretty big advantage, so the B button was configured to shoot fireballs. This, in turn, allowed the A button to be assigned to jump.

Mario also contained a few glitches – such as the infinite 1-UP trick, which involved repeatedly tossing a turtle shell against a block. As it turns out, it was intentionally left in the code; what surprised the developers was how easily players mastered the supposedly complicated trick. The most famous glitch ever, the 'Minus World', also featured in the game. It was an underground water level with the code “ -1”, where the blank in “ -1” was actually the number 36, for which the game's tile table held a blank space. Because of this blank tile, the world title appeared on screen as “ -1” instead of “36-1”, giving the glitch its name. The 'Minus World' has since been removed from remakes of Super Mario Bros.
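If the tile-table explanation sounds abstract, the toy sketch below (a Python illustration by the Gazette, not actual NES code – the table and function names are made up) captures the idea: the world label is drawn by looking each stored value up in a glyph table, and the glyph at index 36 happens to be a blank space.

    # Toy illustration (Python, not actual NES code) of the Minus World display bug:
    # the world label is drawn by looking the stored value up in a tile/glyph table,
    # and the tile at index 36 happens to be a blank space.
    tile_glyph = {i: str(i) for i in range(10)}  # digits 0-9 draw as themselves
    tile_glyph[36] = " "                         # tile 36 is the blank-space glyph

    def world_label(world, level):
        return tile_glyph.get(world, "?") + "-" + str(level)

    print(world_label(1, 1))    # "1-1"  (a normal world)
    print(world_label(36, 1))   # " -1"  (the famous Minus World)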

The gaming world has come a long way since the early days of Mario. But in spite of its obvious shortcomings, the fact remains that Mario is one of the most popular games ever created. It has managed to attract the attention of amateurs and keep avid gamers glued to their computer or video-game screens, looking out cautiously for that malefic turtle or that sudden crevasse that would signal the end for the little plump man. In the age of 'Assassin's Creed' and 'Fallout', Mario, notwithstanding its simple plot and elementary graphics, has retained its stature as an object of admiration in the gaming fraternity.


THEORY OF EVERYTHING

STRING THEORY

The Universe has always been an unfathomable web of secrets for mankind. Its origin and evolution have engaged scientists' curiosity across generations. Geek Gazette rewinds the time machine for a closer glimpse of humanity's understanding of the universe and its origins.

It was the year 1968 when Gabriele Veneziano, an Italian theoretical physicist then searching for a mathematical model of the strong nuclear force, found a 200-year-old set of equations by Leonhard Euler. This set of equations, defined as the gamma function, was the first step towards the well-known STRING THEORY, which was to challenge the little understanding of the universe we had back then. After its rediscovery, the gamma function became quite the subject of discussion among scientists. Leonard Susskind, one such ambitious young physicist, knew that this function represented the strong force, but was perplexed by the fact that it depicted a particle with a structure and an ability to vibrate. What was being described here was a string (like a rubber band) that could not only contract and expand but also wiggle. The theory, although fascinating, didn't go down well with the particle physicists of the time, who considered the universe to be made up of particles, not strings. The string theory didn't live for long.
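For the curious reader, the Euler formula in question, and the scattering amplitude built from it – given here in its standard textbook form rather than as it appeared in 1968 – read

    \Gamma(z) = \int_0^{\infty} t^{\,z-1} e^{-t}\, dt ,
    \qquad
    A(s,t) = \frac{\Gamma(-\alpha(s))\,\Gamma(-\alpha(t))}{\Gamma(-\alpha(s)-\alpha(t))} = B\!\left(-\alpha(s), -\alpha(t)\right) ,

where s and t describe the energies of a two-particle collision and \alpha(x) = \alpha(0) + \alpha' x is a straight-line 'Regge trajectory'. The amplitude is simply Euler's Beta function in disguise – which is how an eighteenth-century identity ended up describing the strong force.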


Meanwhile, mainstream science was embracing particles as points, not strings. New particles were being discovered sporadically, and scientists came up with a striking prediction – even the forces of nature could be explained by particles. Two particles, they explained, interact by exchanging messenger particles; the more such particles are exchanged, the stronger the force gets. According to this picture, if we went back to the time just after the Big Bang, the high temperatures would have caused the Electromagnetic (EM) and Weak Nuclear forces to merge and form the Electroweak (EW) force. And even before that, still higher temperatures would have fused the EW and Strong Nuclear forces into the GRAND UNIFIED FORCE. But this 'Grand Unified Theory' didn't account for the principal force in nature: gravity.

All this while, some physicists were still pursuing the string theory in the hope of proving it. This wasn't easy, owing to the sheer improbability of the corollaries that arose from the theory. For example, string theory predicted the existence of a particle, the TACHYON, which could only travel at speeds greater than that of light. It also predicted a 10-dimensional universe, and a massless particle that had never been observed in experiments. The problem with string theory was that it made startling predictions without concrete empirical evidence, but scientists eventually found a way out. One explanation put forward was: "When we observe a tiny object, it looks like a point, but upon microscopic observation, its dimensions open up. Similarly, the 6 remaining, unobserved dimensions are expected to be so curled up and confined into themselves that they cannot be seen by the common eye, and it is these shapes that give the strings their fundamental frequencies."

By 1973, only a few physicists were still pursuing the string theory, one of them being JOHN SCHWARZ. He persisted, combining equations and rearranging variables in an effort to make sense of it all. He was almost on the verge of abandoning the theory when he suddenly had a brainstorm: the equation resembled the force of gravitation.



For that to be possible, however, the variable quantities had to be really small. To account for this, he suggested that if we scaled ourselves down to a size about 10³⁰ times smaller than an atom, we would see that the massless particle never observed in experiments was actually a GRAVITON – the particle that transmits the force of gravity. The graviton was the particle everybody had been looking for all along – the missing piece of the GUT's standard model. But yet again, the theory fell on deaf ears.

However, this new version of string theory was a big improvement, in the sense that it was capable of describing the existence of all the particles in nature. According to the theory, strings vibrate at different frequencies, and it is the combination of different resonating frequencies that gives particles their properties, such as mass and charge. Not only did it give an elegant picture of the universe, it also resolved the conflict between quantum mechanics and general relativity, which operate on very different scales.

By 1984, the idea caught on and more scientists started pursuing the string theory. The most attractive prospect was that the strings could vibrate in different manners and at different frequencies to give different particles. Hence, just by knowing the properties of these strings, one could explain every aspect of the universe, from tiny atoms to large galaxies, thus taking science one step closer to developing the THEORY OF EVERYTHING. But the idea went so far that scientists eventually ended up with 5 different versions of superstring theory – Type I, Type IIA, Type IIB, heterotic SO(32) and heterotic E8xE8 – each one different in mathematical formulation but sharing the same basic idea of strings. Thankfully, EDWARD WITTEN, a physicist regarded by many as the successor of Einstein in the world of particle physics, came to the rescue.

(Some physicists think humanity is close to discovering the "Theory of Everything". Even then, we couldn't actually predict everything: such a theory would only directly describe the "fundamental interactions" of nature, and the world is just too complex to be predicted in full.)

In 1995, Witten formulated a simple equation which suggested that the 5 versions of string theory weren't actually 5 different theories, but 5 different ways of looking at the same model. His work was so revolutionary that it was given its own name: THE M-THEORY.

But the new theory was accompanied by new assumptions. M-theory proposed that the universe consisted of not 10 but 11 dimensions. The extra dimension allowed strings to stretch out to form membranes. A membrane could have 3 dimensions or more and, with enough energy, could grow to an enormous size – even as large as the universe. Witten postulated that 'our' universe probably lies on a membrane in a much higher-dimensional system. This opened the frontiers for what are known as 'parallel universes' lying on other membranes.

If M-theory is to become the Theory of Everything, it must account for the Big Bang. Though the Big Bang theory nicely explains the beginning of the universe, it doesn't tell us why the Big Bang occurred in the first place. M-theory provides an explanation: the 'multiverse' consists of multiple membranes carrying universes like ours, and these membranes might drift towards each other and collide. If M-theory is to be believed, this kind of collision is exactly what happened at the time of the Big Bang, releasing enormous amounts of energy that ultimately created the universe. Scientists claim that such collisions may happen again and again in the future.



M-theory also explains why gravity is so weak in comparison with the other three fundamental forces of nature. In earlier versions of string theory, strings were pictured as closed loops, whereas M-theory allows strings to be open-ended, with their ends anchored to our membrane. Some strings, however, do exist as loops – one of them being the graviton. With no loose ends, gravitons are free to escape our membrane, which makes gravity's effect much weaker than that of the other forces. Incidentally, this also presents an immense opportunity: if we do live on a membrane and parallel universes do exist, then we may not be able to see or feel them, but we could learn of their existence through an exchange of gravitons.

However, despite its all-encompassing ambition, the biggest threat to M-theory today is that IT CANNOT BE PROVED. It is impossible to go down to the scale of strings to check their existence, and M-theory, as of now, lacks an experimental basis. An effort is being made to at least partially make amends for that at CERN and Fermilab, where enormous atom smashers are at work. A head-on collision of ions or protons in these colliders produces a shower of tiny particles, and it is believed that hidden somewhere among them is a tiny string of gravity – the graviton. Unlike other strings, it is unbounded and thus free to fly off from 'our' membrane into the extra dimensions owing to its high energy. Its escape could be inferred from the absence of the graviton in snapshots taken rapidly during and after the collision. Along with the missing graviton, another particle physicists are looking for in these collisions is the SPARTICLE – the heavier superpartner predicted for every known particle. The problem is that, being extremely heavy (on the subatomic scale), it may lie beyond the reach of today's atom smashers.

The most annoying catch about M-theory, though, is that even if we were somehow able to detect sparticles or the missing gravitons, it would not entirely prove M-theory; it would merely be a justification for the existence of strings or membranes, not a proof of M-theory in itself. Still, it would mean vindication for all those scientists who have persisted with the theory, and an indication that physicists are on the right track towards finding the THEORY OF EVERYTHING.



TIMELINE: PHOTOGRAPHY

13th Century: Roger Bacon describes the camera obscura – essentially a dark box or a room with a hole at one end.

1727: J. Schulze accidentally creates the first ever photosensitive compound.

1826: Joseph Nicéphore Niépce burns the first ever permanent image onto a chemically coated pewter plate.

1861: James Clerk Maxwell demonstrates color photography using the color separation method.

1877: Eadweard Muybridge develops a fast shutter to photograph objects in motion.

1888-89: First Kodak camera launched, with a 20-foot roll of paper; later improved by using film instead of paper.

1906-07: Photostat developed. The Lumiere Brothers introduce 'Autochrome', the first color photography system that can be used by amateurs.

1917: Nippon Kogaku K.K., which will eventually become Nikon, established in Tokyo.

1931: Harold Edgerton (then an Sc.D. student at MIT) develops the stroboscope, ushering in the era of high-speed photography.

1936: Kodak develops Kodachrome, the first multi-layered color film; the Exakta, a pioneering 35mm single-lens reflex (SLR) camera, is launched.

1947: Edwin H. Land announces his invention of the Polaroid camera, and starts selling instant black-and-white film the following year.

1957: Russell Kirsch creates the first ever digital image (176×176 pixels) by scanning a photograph of his 3-month-old son.

1963: First color instant film developed by Polaroid; Instamatic released by Kodak.

1982-83: Sony demonstrates the Mavica "still video" camera; Kodak introduces the disc camera.

1985: Minolta markets the world's first autofocus SLR system (called "Maxxum" in the US).

1990-91: Adobe Photoshop released. Kodak launches the first digital SLR camera – the Kodak DCS-100.

1992: Kodak introduces the Photo CD. JPEG, a compression standard, gets published in an IEEE paper. The first photo is published on the net.

1999: Nikon announces the release of the D1 (2.74 megapixels at a cost of $6,000), its first professional digital SLR camera.

2000: World's first camera phone, the Sharp J-SH04, released in Japan.

2008: Swiss company Seitz Phototechnik creates a milestone with its humongous 160-megapixel camera. Weighing over 10 pounds and priced at USD 45,000, the camera takes one full second to capture a full-resolution 7,500×21,500-pixel shot. Each image, not surprisingly, takes up over 307 MB of storage space.


“WITH GREAT POWER COMES GREAT RESPONSIBILITY”

The words above may remind you of a popular movie, but this is not about superheroes in tight overalls with a misplaced sense of wardrobe ethics – although it does have a far-fetched, gossamer connection. The vigilante has transformed: from the insect-bitten and Dunst-smitten spider to a teenager behind thick-rimmed glasses, obsessed with hashtags, open source and the web. The author wishes to take a bow as you appreciate the gossamer connection. Today's vigilante does not have the Batmobile; it has an IP address – http://213.251.145.96/ . If you couldn't translate that to Wikileaks, we find your lack of faith disturbing (#popular #cliches). Wikileaks, as everyone (except perhaps the gun-toting tribals of Mozambique) by now knows, is a non-profit organisation that provides an anonymous and secure platform for people to publish information of ethical, political and historic significance.


Julian Assange is the man and the whistleblower who fuels the organisation. First coming into the limelight with the release of a classified US military video depicting indiscriminate firing on Iraqi civilians, Wikileaks recently made its biggest disclosure ever, exposing over 250,000 highly confidential US Embassy diplomatic cables. The leaks have sent shockwaves throughout the world and have put governments and world leaders in embarrassing situations (remember Dmitry Medvedev's political fiasco or Hillary Clinton's double-faced diplomacy?). Not surprisingly, Wikileaks is being hailed as the harbinger of Internet-age aggressive journalism, sparking uncomfortable debates on issues such as the freedom of the press and its underlying incompatibility with the so-called larger "public interest". A friend in need is a friend indeed, but he is usually lonely, and the going hasn't been easy for Wikileaks either. Wikileaks watched in dismay as its most crucial supporters, including PayPal, MasterCard and Amazon among others, distanced themselves from it under political pressure. Julian Assange was accused of rape by two Swedish girls – allegedly a staged show to put Wikileaks on the back foot. Bank of America hired three firms to discredit Wikileaks by publishing false information about the site. With so much political antagonism, Wikileaks was crumbling. That's when the men stepped in.



These men – part of the huge Internet community called 'Anonymous' – were basically a group of hacking activists (or hacktivists). They ran an operation called 'Operation Payback', which until then had mostly been directed against the opponents of Internet piracy; they spent quality time attacking the attackers of torrent sites. What is piracy for you is freedom for them, so says their motto. Wikileaks was the pulpit of free speech on the Internet, and, as you might have guessed, a definite favourite among these men. They were furious at the loss of support from the various organisations that cowed down as the heat turned up and parted ways with Wikileaks. Today a fury, tomorrow a hack. This anonymous group decided to attack the PayPal and Amazon servers and launched a DDoS attack (Distributed Denial of Service – an attack which involves overwhelming a server with so many illegitimate requests that it becomes unable to serve any genuine ones). PayPal came down; MasterCard followed. Amazon survived, thanks to its cloud. In response, HBGary, one of the firms hired by the Bank of America to mess with Wikileaks, was given the responsibility of exposing this anonymous group. Aaron Barr, HBGary's CEO, underestimating the force that was anonymity, claimed that he knew the exact identities of the group's members and that they would soon be exposed. HBGary's website was hacked the next day. Aaron's email archive was published on a Russian website and his personal details were posted to his Twitter account. The superheroes had arrived – not made of kryptonite perhaps, but formidable nonetheless.

The whole WikiLeaks and Operation Payback phenomenon has brought to light the power of the web and the responsibility that the powerful must assume. Free speech that exposes corruption, inefficient governance, internationally sensitive information and any opinion of larger public interest must be protected. The responsibility is clear – make them pay back. When the political clout is strong and on the wrong side, the crook becomes a necessity. Hacking, in our opinion, is ethical as long as it promotes public interest and curbs the forces that silence people's voice. Your opinions may differ, and we would love to have them on our Facebook discussions page: http://www.facebook.com/geekgazette.


CURTA MECHANICAL CALCULATOR

Ever imagined using an object resembling a pepper grinder for making hard-core mathematical calculations? Enter the Curta, the first truly pocket-sized mechanical calculator, capable of performing calculations involving real numbers and roots with consummate ease. It was the creation of Curt Herzstark, a Jew who designed this small but impressive number cruncher while in German captivity – and, amazingly, it was intended as a present for Hitler at the end of World War II.

The device can perform complex calculations, such as finding square roots, by breaking any mathematical function down into a series of iterated additions and subtractions. The Curta was popular among car-rally enthusiasts through the 1970s and 1980s. Its cost (about $600), coupled with the advent of cheap, portable electronic calculators, eventually made the Curta obsolete, but it is still rated among the top 100 icons of retro technology.
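To get a feel for how a root can fall out of nothing but cranked additions and subtractions, here is a minimal sketch in Python (ours, purely illustrative – a real Curta operator would have used a cleverer digit-by-digit refinement of the same trick). It finds an integer square root using the identity 1 + 3 + 5 + … + (2k − 1) = k².

    # Minimal sketch (purely illustrative): extract an integer square root
    # using nothing but repeated subtraction, the kind of add/subtract
    # iteration a Curta operator would crank out by hand.
    def sqrt_by_subtraction(n):
        odd, root = 1, 0
        while n >= odd:
            n -= odd      # subtract the next odd number
            odd += 2
            root += 1     # each subtraction accounts for one unit of the root
        return root       # floor of the square root of the original n

    print(sqrt_by_subtraction(144))  # -> 12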

Solution to PenRose Sudoku




