The Geeks Daily

Thread: The Geeks Daily

  1. #1
    Wise Old Owl sygeek's Avatar
    Join Date
    Apr 2011
    Location
    Lucknow
    Posts
    1,876

    Default The Geeks Daily

    This thread is meant for sharing interesting articles related to technology and geek culture that don't fit into the Technology News section, the Random News section, or the OSS article thread.

    Now, The Rules:
    1. Please don't copy-paste the entire article if the site's Terms and Conditions don't allow it. A link with a summary of the article in quotes is better.
    2. If the site doesn't have any "Terms and Conditions", or it allows the article to be republished in full (with a link back to the site), then you are free to paste the entire article.
    3. When pasting a full article, keep it inside SPOILER tags: [SPOILER][/SPOILER].
    4. Custom-written articles can be posted here too. Add a [Custom] tag to such posts.
    5. Please send trackbacks to the site whose article you are using in the post.
    6. Discussion related to an article is allowed as long as it sticks to the topic.
    7. Off-topic posts and posts not following the corresponding site's T&C will be reported to the mods immediately.
    Last edited by sygeek; 07-06-2011 at 09:31 AM.

  2. #2
    Wise Old Owl sygeek's Avatar
    Join Date
    Apr 2011
    Location
    Lucknow
    Posts
    1,876

    Default CPU vs. The Human Brain


    Spoiler:

    The brain's waves drive computation, sort of, in a 5 million core, 9 Hz computer.

    Computer manufacturers have worked in recent years to wean us off the speed metric for their chips and systems. No longer do they scream out GHz values, but use chip brands like Atom, Core Duo, and quad core, or just give up altogether and sell on other features. They don't really have much to crow about, since chip speed increases have slowed with the increasing difficulty of cramming more elements and heat into ever smaller areas. The current state of the art is about 3 GHz (far below predictions from 2001), on four cores in one computer, meaning that computations are spread over four different processors, each of which runs at roughly 0.33 nanoseconds per computation cycle.

    The division of CPUs into different cores hasn't been a matter of choice, and it hasn't been well supported by software, most of which continues to be conceived and written in linear fashion, with the top-level computer system doling out whole programs to the different processors, now that we typically have several things going on at once on our computers. Each program sends its instructions in linear order through one processor/core, in soda-straw fashion. Ever-higher clock speeds, allowing more rapid progress through the straw, still remain critical for getting more work done.

    Our brains take a rather different approach to cores, clock speeds, and parallel processing, however. They operate at variable clock speeds between 5 and 500 Hertz. No Giga here, or Mega or even Kilo. Brain waves, whose relationship to computation remains somewhat mysterious, are very slow, ranging from the delta (sleep) waves of 0-4 Hz through theta, alpha, beta, and gamma waves at 30-100+ Hz which are energetically most costly and may correlate with attention / consciousness.

    On the other hand, the brain has about 1e15 synapses, making it analogous to five million contemporary 200 million transistor chip "cores". Needless to say, the brain takes a massively parallel approach to computation. Signals run through millions of parallel nerve fibers from, say, the eye, (1.2 million in each optic nerve), through massive brain regions where each signal traverses only perhaps ten to twenty nerves in any serial path, while branching out in millions of directions as the data is sliced, diced, and re-assembled into vision. If you are interested in visual pathways, I would recommend Christof Koch's Quest for Consciousness, whose treatment of visual pathways is better than its treatment of other topics.

    Unlike transistors, neurons are intrinsically rhythmic to various degrees due to their ion channel complements that govern firing and refractory/recovery times. So external "clocking" is not always needed to make them run, though the present articles deal with one such case. Neurons can spontaneously generate synchrony in large numbers due to their intrinsic rhythmicity.

    Nor are neurons passive input-output integrators of whatever hits their dendrites, as early theories had them. Instead, they spontaneously generate cycles and noise, which enhances their sensitivity to external signals, and their ability to act collectively. They are also subject to many other influences like hormones and local non-neural glial cells. A great deal of integration happens at the synapse and regional multi-synapse levels, long before the cell body or axon is activated. This is why the synapse count is a better analog to transistor counts on chips than the neuron count. If you are interested in the topics of noise and rhythmicity, I would recommend the outstanding and advanced book by Gyorgy Buzsaki, Rhythms of the Brain. Without buying a book, you can read Buzsaki's take on consciousness.

    Two recent articles (Brandon et al., Koenig et al.) provide a small advance in this field of figuring out how brain rhythms connect with computation. Two groups seem to have had the same idea and did very similar experiments to show that a specific type of spatial computation in a brain area called the medial entorhinal cortex (mEC) near the hippocampus depends on theta rhythm clocking from a loosely connected area called the medial septum (MS). (In-depth essay on alcohol, blackouts, memory formation, the medial septum, and hippocampus, with a helpful anatomical drawing).

    Damage to the MS (situated just below the corpus callosum that connects the two brain hemispheres) was known to have a variety of effects on functions not located in the MS, but in the hippocampus and mEC, like loss of spatial memory, slowed learning of simple aversive associations, and altered patterns of food and water intake.

    The hippocampus and allied areas like the mEC are among the best-investigated parts of the brain, along with the visual system. They mediate most short-term memory, especially spatial memory (e.g., rats running in mazes). The spatial system as understood so far has several types of cells:

    Head direction cells, which know which way the head is pointed (some of them fire when the head points at one angle, others fire at other angles).

    Grid cells, which are sensitive to an abstract grid in space covering the ambient environment. Some of these cells fire when the rat is on one of the grid boundaries. So we literally have a latitude/longitude-style map in our heads, which may be why map-making comes so naturally to humans.

    Border cells, which fire when the rat is close to a wall.

    Place cells, which respond to specific locations in the ambient space- not periodically like grid cells, but typically to one place only.

    Spatial view cells, which fire when the rat is looking at a particular location, rather than when it is in that location. They also respond, as do the other cells above, when a location is being recalled rather than experienced.

    Clearly, once these cells all network together, a rather detailed self-orientation system is possible, based on high-level input from various senses (vestibular, whiskers, vision, touch). The role of rhythm is complicated in this system. For instance, the phase relation of place cell firing versus the underlying theta rhythm, (leading or following it, in a sort of syncopation), indicates closely where the animal is within the place cell's region as movement occurs. Upon entry, firing begins at the peak of the theta wave, but then precesses to the trough of the theta wave as the animal reaches the exit. Combined over many adjacent and overlapping place fields, this could conceptually provide very high precision to the animal's sense of position.


    One rat's repeated tracks in a closed maze, mapped versus firing patterns of several of its place cells, each given a different color.

    We are eavesdropping here on the unconscious processes of an animal, which it could not itself really articulate even if it wished and had language to do so. The grid and place fields are not conscious at all, but enormously intricate mechanisms that underlie implicit mapping. The animal has a "sense" of its position, (projecting a bit from our own experience), which is critical to many of its further decisions, but the details don't necessarily reach consciousness.

    The current papers deal not with place cells, which still fire in a place-specific way without the theta rhythm, but with grid cells, whose "gridness" appears to depend strongly on the theta rhythm. The real-life fields of rat grid cells have a honeycomb-like hexagonal shape with diameters ranging from 40 to 90 cm, ordered in systematic fashion from top to bottom within the mEC anatomy. The theta rhythm frequency they respond to also varies along the same axis, from 10 to 4 Hz. These values stretch and vary with the environment the animal finds itself in.


    Field size of grid cells, plotted against anatomical depth in the mEC.

    The current papers ask a simple question: do the grid cells of the mEC depend on the theta rhythm supplied from the MS, as has long been suspected from work with mEC lesions, or do they work independently and generate their own rhythm(s)?

    This was investigated by the expedient of injecting anaesthetics into the MS to temporarily stop its theta wave generation, and then polling electrodes stuck into the mEC for their grid firing characteristics as the rats were freely moving around. The grid cells still fired, but lost their spatial coherence, firing without regard to where the rat was or was going physically (see bottom trajectory maps). Spatial mapping was lost when the clock-like rhythm was lost.


    One experimental sequence. Top is the schematic of what was done. Rate map shows the firing rate of the target grid cells in a sampled 3cm square, with m=mean rate, and p=peak rate. Spatial autocorrelation shows how spatially periodic the rate map data is, and at what interval. Gridness is an abstract metric of how spatially periodic the cells fire. Trajectory shows the rat's physical paths during free behavior, overlaid with the grid cell firing data.

    "These data support the hypothesized role of theta rhythm oscillations in the generation of grid cell spatial periodicity or at least a role of MS input. The loss of grid cell spatial periodicity could contribute to the spatial memory impairments caused by lesions or inactivation of the MS."

    This is somewhat reminiscent of an artificial computer system, where computation ceases (here it becomes chaotic) when clocking ceases. Brain systems are clearly much more robust, breaking down more gracefully and not being as heavily dependent on clocking of this kind, not to mention being capable of generating most rhythms endogenously. But a similar phenomenon happens more generally, of course, during anesthesia, where the controlled long-range chaos of the gamma oscillation ceases along with attention and consciousness.

    It might be worth adding that brain waves have no particular connection with rhythmic sensory inputs like sound waves, some of which come in the same frequency range, at least at the very low end. The transduction of sound through the cochlea into neural impulses encodes them in a much more sophisticated way than simply reproducing their frequency in electrical form, and leads to wonders of computational processing such as perfect pitch, speech interpretation, and echolocation.

    Clearly, these are still early days in the effort to know how computation takes place in the brain. There is a highly mysterious bundling of widely varying timing/clocking rhythms with messy anatomy and complex content flowing through. But we also understand a lot, far more with each successive decade of work and with advancing technologies. For a few systems (vision, position, some forms of emotion), we can track much of the circuitry from sensation to high-level processing, such as the level of face recognition. Consciousness remains unexplained, but scientists are definitely knocking at the door.


    Spoiler:

    You'd think it'd be easy to reboot a PC, wouldn't you? But then you'd also think that it'd be straightforward to convince people that at least making some effort to be nice to each other would be a mutually beneficial proposal, and look how well that's worked for us.

    Linux has a bunch of different ways to reset an x86. Some of them are 32-bit only and so I'm just going to ignore them because honestly just what are you doing with your life. Also, they're horrible. So, that leaves us with five of them.

    • kbd - reboot via the keyboard controller. The original IBM PC had the CPU reset line tied to the keyboard controller. Writing the appropriate magic value pulses the line and the machine resets. This is all very straightforward, except for the fact that modern machines don't have keyboard controllers (they're actually part of the embedded controller) and even more modern machines don't even pretend to have a keyboard controller. Now, embedded controllers run software. And, as we all know, software is dreadful. But, worse, the software on the embedded controller has been written by BIOS authors. So clearly any pretence that this ever works is some kind of elaborate fiction. Some machines are very picky about hardware being in the exact state that Windows would program. Some machines work 9 times out of 10 and then lock up due to some odd timing issue. And others simply don't work at all. Hurrah!
    • triple - attempt to generate a triple fault. This is done by loading an empty interrupt descriptor table and then calling int(3). The interrupt fails (there's no IDT), the fault handler fails (there's no IDT) and the CPU enters a condition which should, in theory, then trigger a reset. Except there doesn't seem to be a requirement that this happen and it just doesn't work on a bunch of machines.
    • pci - not actually pci. Traditional PCI config space access is achieved by writing a 32 bit value to io port 0xcf8 to identify the bus, device, function and config register. Port 0xcfc then contains the register in question. But if you write the appropriate pair of magic values to 0xcf9, the machine will reboot. Spectacular! And not standardised in any way (certainly not part of the PCI spec), so different chipsets may have different requirements. Booo.
    • efi - EFI runtime services provide an entry point to reboot the machine. It usually even works! As long as EFI runtime services are working at all, which may be a stretch.
    • acpi - Recent versions of the ACPI spec let you provide an address (typically memory or system IO space) and a value to write there. The idea is that writing the value to the address resets the system. It turns out that doing so often fails. It's also impossible to represent the PCI reboot method via ACPI, because the PCI reboot method requires a pair of values and ACPI only gives you one.



    Now, I'll admit that this all sounds pretty depressing. But people clearly sell computers with the expectation that they'll reboot correctly, so what's going on here?

    A while back I did some tests with Windows running on top of qemu. This is a great way to evaluate OS behaviour, because you've got complete control of what's handed to the OS and what the OS tries to do to the hardware. And what I discovered was a little surprising. In the absence of an ACPI reboot vector, Windows will hit the keyboard controller, wait a while, hit it again and then give up. If an ACPI reboot vector is present, Windows will poke it, try the keyboard controller, poke the ACPI vector again and try the keyboard controller one more time.

    This turns out to be important. The first thing it means is that it generates two writes to the ACPI reboot vector. The second is that it leaves a gap between them while it's fiddling with the keyboard controller. And, shockingly, it turns out that on most systems the ACPI reboot vector points at 0xcf9 in system IO space. Even though most implementations nominally require two different values be written, it seems that this isn't a strict requirement and the ACPI method works.

    Linux 3.0 will ship with this behaviour by default. It makes various machines work (some Apples, for instance), improves things on some others (some Thinkpads seem to sit around for extended periods of time otherwise) and hopefully avoids the need to add any more machine-specific quirks to the reboot code. There's still some divergence between us and Windows (mostly in how often we write to the keyboard controller), which can be cleaned up if it turns out to make a difference anywhere.

    Now. Back to EFI bugs.
    Last edited by sygeek; 04-06-2011 at 06:18 AM.

  3. #3
    Wise Old Owl sygeek's Avatar
    Join Date
    Apr 2011
    Location
    Lucknow
    Posts
    1,876

    Default Re: The Geeks Daily

    Ten Oddities And Secrets About JavaScript
    Visit link for full article

    JavaScript. At once bizarre and yet beautiful, it is surely the programming language that Pablo Picasso would have invented. Null is apparently an object, an empty array is apparently equal to false, and functions are bandied around as though they were tennis balls.

    This article is aimed at intermediate developers who are curious about more advanced JavaScript. It is a collection of JavaScript’s oddities and well-kept secrets. Some sections will hopefully give you insight into how these curiosities can be useful to your code, while other sections are pure WTF material. So, let’s get started.

    Spoiler:
    1. Null is an Object
    2. NaN is a Number
    3. An Array With No Keys == False (About Truthy and Falsy)
    4. replace() Can Accept a Callback Function
    5. Regular Expressions: More Than Just Match and Replace
    6. You Can Fake Scope
    7. Functions Can Execute Themselves
    8. Firefox Reads and Returns Colors in RGB, Not Hex
    9. 0.1 + 0.2 !== 0.3
    10. Undefined Can Be Defined
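    Since the spoiler only lists the headings, here are a few of these oddities reproduced as runnable snippets (my own illustrations, not taken from the article); they should work in any browser console or Node.js:

    [CODE]
    // Illustrative snippets (not from the article) for some of the listed oddities.

    // 1. Null is an object -- at least according to typeof
    console.log(typeof null);             // "object"
    console.log(null instanceof Object);  // false, so typeof is misleading here

    // 2. NaN is a number, yet never equal to itself
    console.log(typeof NaN);              // "number"
    console.log(NaN === NaN);             // false

    // 3. An array with no keys == false (truthy vs. falsy)
    console.log([] == false);             // true  (loose equality coerces [] to "")
    console.log([] ? "truthy" : "falsy"); // "truthy" -- yet as a value it is truthy

    // 4. replace() can accept a callback function
    console.log("10 apples".replace(/\d+/, function (match) {
        return Number(match) * 2;         // "20 apples"
    }));

    // 7. Functions can execute themselves (an immediately invoked function expression)
    var greeting = (function () {
        return "I ran immediately";
    })();
    console.log(greeting);
    [/CODE]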

  4. #4
    Wise Old Owl sygeek's Avatar
    Join Date
    Apr 2011
    Location
    Lucknow
    Posts
    1,876

    Default Re: The Geeks Daily

    By James Somers

    Spoiler:


    When Colin Hughes was about eleven years old his parents brought home a rather strange toy. It wasn't colorful or cartoonish; it didn't seem to have any lasers or wheels or flashing lights; the box it came in was decorated, not with the bust of a supervillain or gleaming protagonist, but bulleted text and a picture of a QWERTY keyboard. It called itself the "ORIC-1 Micro Computer." The package included two cassette tapes, a few cords and a 130-page programming manual.

    On the whole it looked like a pretty crappy gift for a young boy. But his parents insisted he take it for a spin, not least because they had just bought the thing for more than £129. And so he did. And so, he says, "I was sucked into a hole from which I would never escape."

    It's not hard to see why. Although this was 1983, and the ORIC-1 had about the same raw computing power as a modern alarm clock, there was something oddly compelling about it. When you turned it on all you saw was the word "Ready," and beneath that, a blinking cursor. It was an open invitation: type something, see what happens.

    In less than an hour, the ORIC-1 manual took you from printing the word "hello" to writing short programs in BASIC -- the Beginner's All-Purpose Symbolic Instruction Code -- that played digital music and drew wildly interesting pictures on the screen. Just when you got the urge to try something more complicated, the manual showed you how.

    In a way, the ORIC-1 was so mesmerizing because it stripped computing down to its most basic form: you typed some instructions; it did something cool. This was the computer's essential magic laid bare. Somehow ten or twenty lines of code became shapes and sounds; somehow the machine breathed life into a block of text.

    No wonder Colin got hooked. The ORIC-1 wasn't really a toy, but a toy maker. All it asked for was a special kind of blueprint.

    Once he learned the language, it wasn't long before he was writing his own simple computer games, and, soon after, teaching himself trigonometry, calculus and Newtonian mechanics to make them better. He learned how to model gravity, friction and viscosity. He learned how to make intelligent enemies.

    More than all that, though, he learned how to teach. Without quite knowing it, Colin had absorbed from his early days with the ORIC-1 and other such microcomputers a sense for how the right mix of accessibility and complexity, of constraints and open-endedness, could take a student from total ignorance to near mastery quicker than anyone -- including his own teachers -- thought possible.

    It was a sense that would come in handy, years later, when he gave birth to Project Euler, a peculiar website that has trained tens of thousands of new programmers, and that is in its own modest way the emblem of a nascent revolution in education.


    * * *

    Sometime between middle and high school, in the early 2000s, I got a hankering to write code. It was very much a "monkey see, monkey do" sort of impulse. I had been watching a lot of TechTV -- an obscure but much-loved cable channel focused on computing, gadgets, gaming and the Web -- and Hackers, the 1995 cult classic starring Angelina Jolie in which teenaged computer whizzes, accused of cybercrimes they didn't commit, have to hack their way to the truth.

    I wanted in. So I did what you might expect an over-enthusiastic suburban nitwit to do, and asked my mom to drive me to the mall to buy Ivor Horton's 1,181-page, 4.6-pound Beginning Visual C++ 6. I imagined myself working montage-like through the book, smoothly accruing expertise one chapter at a time.

    What happened instead is that I burned out after a week. The text itself was dense and unsmiling; the exercises were difficult. It was quite possibly the least fun I've ever had with a book, or, for that matter, with anything at all. I dropped it as quickly as I had picked it up.

    Remarkably I went through this cycle several times: I saw people programming and thought it looked cool, resolved myself to learn, sought out a book and crashed the moment it got hard.

    For a while I thought I didn't have the right kind of brain for programming. Maybe I needed to be better at math. Maybe I needed to be smarter.

    But it turns out that the people trying to teach me were just doing a bad job. Those books that dragged me through a series of structured principles were just bad books. I should have ignored them. I should have just played.

    Nobody misses that fact more egregiously than the American College Board, the folks responsible for setting the AP Computer Science high school curriculum. The AP curriculum ought to be a model for how to teach people to program. Instead it's an example of how something intrinsically amusing can be made into a lifeless slog.


    I imagine that the College Board approached the problem from the top down. I imagine a group of people sat in a room somewhere and asked themselves, "What should students know by the time they finish this course?"; listed some concepts, vocabulary terms, snippets of code and provisional test questions; arranged them into "modules," swaths of exposition followed by exercises; then handed off the course, ready-made, to teachers who had no choice but to follow it to the letter.

    Whatever the process, the product is a nightmare described eloquently by Paul Lockhart, a high school mathematics teacher, in his short booklet, A Mathematician's Lament, about the sorry state of high school mathematics. His argument applies almost beat for beat to computer programming.

    Lockhart illustrates our system's sickness by imagining a fun problem, then showing how it might be gutted by educators trying to "cover" more "material."

    Take a look at this picture:

    It's sort of neat to wonder, How much of the box does the triangle take up? Two-thirds, maybe? Take a moment and try to figure it out.

    If you're having trouble, it could be because you don't have much training in real math, that is, in solving open-ended problems about simple shapes and objects. It's hard work. But it's also kind of fun -- it requires patience, creativity, an insight here and there. It feels more like working on a puzzle than one of those tedious drills at the back of a textbook.

    If you struggle for long enough you might strike upon the rather clever idea of chopping your rectangle into two pieces like so:


    Now you have two rectangles, each cut diagonally in half by a leg of the triangle. So there is exactly as much space inside the triangle as outside, which means the triangle must take up exactly half the box!
    This is what a piece of mathematics looks and feels like. That little narrative is an example of the mathematician's art: asking simple and elegant questions about our imaginary creations, and crafting satisfying and beautiful explanations. There is really nothing else quite like this realm of pure idea; it's fascinating, it's fun, and it's free!
    But this is not what math feels like in school. The creative process is inverted, vitiated:
    This is why it is so heartbreaking to see what is being done to mathematics in school. This rich and fascinating adventure of the imagination has been reduced to a sterile set of "facts" to be memorized and procedures to be followed. In place of a simple and natural question about shapes, and a creative and rewarding process of invention and discovery, students are treated to this:


    "The area of a triangle is equal to one-half its base times its height." Students are asked to memorize this formula and then "apply" it over and over in the "exercises." Gone is the thrill, the joy, even the pain and frustration of the creative act. There is not even a problem anymore. The question has been asked and answered at the same time -- there is nothing left for the student to do.
    * * *
    My struggle to become a hacker finally saw a breakthrough late in my freshman year of college, when I stumbled on a simple question:
    If we list all the natural numbers below 10 that are multiples of 3 or 5, we get 3, 5, 6 and 9. The sum of these multiples is 23.

    Find the sum of all the multiples of 3 or 5 below 1000.
    This was the puzzle that turned me into a programmer. This was Project Euler problem #1, written in 2001 by a then much older Colin Hughes, that student of the ORIC-1 who had gone on to become a math teacher at a small British grammar school and, not long after, the unseen professor to tens of thousands of fledglings like myself.

    The problem itself is a lot like Lockhart's triangle question -- simple enough to entice the freshest beginner, sufficiently complicated to require some thought.

    What's especially neat about it is that someone who has never programmed -- someone who doesn't even know what a program is -- can learn to write code that solves this problem in less than three hours. I've seen it happen. All it takes is a little hunger. You just have to want the answer.
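    (For reference, here is one minimal JavaScript sketch of the sort of first program being described; it is not the article's code, and certainly not the only way to solve it.)

    [CODE]
    // Project Euler problem 1 (illustrative sketch, not the article's solution):
    // sum of all the multiples of 3 or 5 below a limit.
    function sumOfMultiples(limit) {
        var sum = 0;
        for (var n = 1; n < limit; n++) {
            if (n % 3 === 0 || n % 5 === 0) {
                sum += n;
            }
        }
        return sum;
    }

    console.log(sumOfMultiples(10));    // 23, matching the worked example in the problem
    console.log(sumOfMultiples(1000));  // the answer to the problem itself
    [/CODE]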

    That's the pedagogical ballgame: get your student to want to find something out. All that's left after that is to make yourself available for hints and questions. "That student is taught the best who is told the least."

    It's like sitting a kid down at the ORIC-1. Kids are naturally curious. They love blank slates: a sandbox, a bag of LEGOs. Once you show them a little of what the machine can do they'll clamor for more. They'll want to know how to make that circle a little smaller or how to make that song go a little faster. They'll imagine a game in their head and then relentlessly fight to build it.

    Along the way, of course, they'll start to pick up all the concepts you wanted to teach them in the first place. And those concepts will stick because they learned them not in a vacuum, but in the service of a problem they were itching to solve.

    Project Euler, named for the Swiss mathematician Leonhard Euler, is popular (more than 150,000 users have submitted 2,630,835 solutions) precisely because Colin Hughes -- and later, a team of eight or nine hand-picked helpers -- crafted problems that lots of people get the itch to solve. And it's an effective teacher because those problems are arranged like the programs in the ORIC-1's manual, in what Hughes calls an "inductive chain":

    The problems range in difficulty and for many the experience is inductive chain learning. That is, by solving one problem it will expose you to a new concept that allows you to undertake a previously inaccessible problem. So the determined participant will slowly but surely work his/her way through every problem.

    This is an idea that's long been familiar to video game designers, who know that players have the most fun when they're pushed always to the edge of their ability. The trick is to craft a ladder of increasingly difficult levels, each one building on the last. New skills are introduced with an easier version of a challenge -- a quick demonstration that's hard to screw up -- and certified with a harder version, the idea being to only let players move on when they've shown that they're ready. The result is a gradual ratcheting up of the learning curve.

    Project Euler is engaging in part because it's set up like a video game, with 340 fun, very carefully ordered problems. Each has its own page, like this one that asks you to discover the three most popular squares in a game of Monopoly played with 4-sided (instead of 6-sided) dice. At the bottom of the puzzle description is a box where you can enter your answer, usually just a whole number. The only "rule" is that the program you use to solve the problem should take no more than one minute of computer time to run.

    On top of this there is one brilliant feature: once you get the right answer you're given access to a forum where successful solvers share their approaches. It's the ideal time to pick up new ideas -- after you've wrapped your head around a problem enough to solve it.

    This is also why a lot of experienced programmers use Project Euler to learn a new language. Each problem's forum is a kind of Rosetta stone. For a single simple problem you might find annotated solutions in Python, C, Assembler, BASIC, Ruby, Java, J and FORTRAN.

    Even if you're not a programmer, it's worth solving a Project Euler problem just to see what happens in these forums. What you'll find there is something that educators, technologists and journalists have been talking about for decades. And for nine years it's been quietly thriving on this site. It's the global, distributed classroom, a nurturing community of self-motivated learners -- old, young, from more than two hundred countries -- all sharing in the pleasure of finding things out.

    * * *

    It's tempting to generalize: If programming is best learned in this playful, bottom-up way, why not everything else? Could there be a Project Euler for English or Biology?

    Maybe. But I think it helps to recognize that programming is actually a very unusual activity. Two features in particular stick out.

    The first is that it's naturally addictive. Computers are really fast; even in the '80s they were really fast. What that means is there is almost no time between changing your program and seeing the results. That short feedback loop is mentally very powerful. Every few minutes you get a little payoff -- perhaps a small hit of dopamine -- as you hack and tweak, hack and tweak, and see that your program is a little bit better, a little bit closer to what you had in mind.

    It's important because learning is all about solving hard problems, and solving hard problems is all about not giving up. So a machine that triggers hours-long bouts of frantic obsessive excitement is a pretty nifty learning tool.

    The second feature, by contrast, is something that at first glance looks totally immaterial. It's the simple fact that code is text.

    Let's say that your sink is broken, maybe clogged, and you're feeling bold -- instead of calling a plumber you decide to fix it yourself. It would be nice if you could take a picture of your pipes, plug it into Google, and instantly find a page where five or six other people explained in detail how they dealt with the same problem. It would be especially nice if once you found a solution you liked, you could somehow immediately apply it to your sink.

    Unfortunately that's not going to happen. You can't just copy and paste a Bob Vila video to fix your garage door.

    But the really crazy thing is that this is what programmers do all day, and the reason they can do it is because code is text.

    I think that goes a long way toward explaining why so many programmers are self-taught. Sharing solutions to programming problems is easy, perhaps easier than sharing solutions to anything else, because the medium of information exchange -- text -- is the medium of action. Code is its own description. There's no translation involved in making it go.

    Programmers take advantage of that fact every day. The Web is teeming with code because code is text and text is cheap, portable and searchable. Copying is encouraged, not frowned upon. The neophyte programmer never has to learn alone.

    * * *

    Garry Kasparov, a chess grandmaster who was famously bested by IBM's Deep Blue supercomputer, notes how machines have changed the way the game is learned:
    There have been many unintended consequences, both positive and negative, of the rapid proliferation of powerful chess software. Kids love computers and take to them naturally, so it's no surprise that the same is true of the combination of chess and computers. With the introduction of super-powerful software it became possible for a youngster to have a top-level opponent at home instead of needing a professional trainer from an early age. Countries with little by way of chess tradition and few available coaches can now produce prodigies.
    A student can now download a free program that plays better than any living human. He can use it as a sparring partner, a coach, an encyclopedia of important games and openings, or a highly technical analyst of individual positions. He can become an expert without ever leaving the house.

    Take that thought to its logical end. Imagine a future in which the best way to learn how to do something -- how to write prose, how to solve differential equations, how to fly a plane -- is to download software, not unlike today's chess engines, that takes you from zero to sixty by way of a delightfully addictive inductive chain.

    If the idea sounds far-fetched, consider that I was taught to program by a program whose programmer, more than twenty-five years earlier, was taught to program by a program.
    Last edited by sygeek; 04-06-2011 at 06:17 AM.

  5. #5
    Your Ad here nisargshah95's Avatar
    Join Date
    Feb 2010
    Location
    Goa, India
    Posts
    397

    Thumbs up Re: The Geeks Daily

    Quote Originally Posted by SyGeek View Post
    0.1 + 0.2 !== 0.3
    For those who want to know why it's like this, see "14. Floating Point Arithmetic: Issues and Limitations" in the Python v2.7.1 documentation.
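    A quick illustration of what actually happens (my own snippet, not from the linked docs), along with the usual tolerance-based workaround:

    [CODE]
    // 0.1 and 0.2 have no exact binary floating-point representation,
    // so their sum picks up a tiny rounding error.
    console.log(0.1 + 0.2);            // 0.30000000000000004
    console.log(0.1 + 0.2 === 0.3);    // false

    // Common workaround: compare against a small tolerance (epsilon).
    function nearlyEqual(a, b, eps) {
        return Math.abs(a - b) < (eps || 1e-9);
    }
    console.log(nearlyEqual(0.1 + 0.2, 0.3));  // true
    [/CODE]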

    Quote Originally Posted by SyGeek View Post
    By James Somers

    Great article, buddy. Keep posting! I guess we should start a thread where we discuss Project Euler problems. What say?
    Last edited by nisargshah95; 04-06-2011 at 02:01 PM.
    "The nature of the Internet and the importance of net neutrality is that innovation can come from everyone."
    1. System: HP 15-r022TX {Intel i5 4210U | 8GB RAM | NVIDIA GeForce 820M | Ubuntu 14.04 Trusty Tahr LTS 64-bit (Primary) + Windows 8.1 Pro 64-bit | 1TB HDD + 1TB Seagate Backup Plus external HDD | Dell KB212 Keyboard | Dell MS111 Mouse | Belkin 4 port Sting Ray USB Hub}
    Twitter - https://twitter.com/nisargshah95

  6. #6
    Wise Old Owl sygeek's Avatar
    Join Date
    Apr 2011
    Location
    Lucknow
    Posts
    1,876

    Default Re: The Geeks Daily

    Quote Originally Posted by nisargshah95 View Post
    Great article, buddy. Keep posting! I guess we should start a thread where we discuss Project Euler problems. What say?
    Sure, but no one seems interested in it, so I didn't bother creating one. Also, the Project Euler forums already have a section dedicated to this, so it doesn't make much sense unless you guys want to discuss it within a familiar community here.
    Last edited by sygeek; 04-06-2011 at 03:06 PM.

  7. #7
    Your Ad here nisargshah95's Avatar
    Join Date
    Feb 2010
    Location
    Goa, India
    Posts
    397

    Thumbs up Re: The Geeks Daily

    Quote Originally Posted by SyGeek View Post
    Sure, but no one seems interested in it, so I didn't bother creating one. Also, the Project Euler forums already have a section dedicated to this, so it doesn't make much sense unless you guys want to discuss it within a familiar community here.
    Oh. Anyway, don't stop posting the articles. They're good.

    BTW Yay! I solved the first problem - If we list all the natural numbers below 10 that are multiples of 3 or 5, we get 3, 5, 6 and 9. The sum of these multiples is 23.
    Find the sum of all the multiples of 3 or 5 below 1000. Did it using JavaScript (and Python console for calculations).
    Last edited by nisargshah95; 04-06-2011 at 06:46 PM.
    "The nature of the Internet and the importance of net neutrality is that innovation can come from everyone."
    1. System: HP 15-r022TX {Intel i5 4210U | 8GB RAM | NVIDIA GeForce 820M | Ubuntu 14.04 Trusty Tahr LTS 64-bit (Primary) + Windows 8.1 Pro 64-bit | 1TB HDD + 1TB Seagate Backup Plus external HDD | Dell KB212 Keyboard | Dell MS111 Mouse | Belkin 4 port Sting Ray USB Hub}
    Twitter - https://twitter.com/nisargshah95

  8. #8
    Wise Old Owl sygeek's Avatar
    Join Date
    Apr 2011
    Location
    Lucknow
    Posts
    1,876

    Default Hate Java? You’re fighting the wrong battle.


    Spoiler:
    One of the most interesting trends I've seen lately is the unpopularity of Java around blogs, DZone and others. It seems some people are even offended, on a personal level, by the suggestion that Java is superior in any way to their favorite web 2.0 language.

    Java has been widely successful for a number of reasons:
    • It’s widely accepted in the established companies.
    • It’s one of the fastest languages.
    • It’s one of the most secure languages.
    • Synchronization primitives are built into the language.
    • It’s platform independent.
    • Hotspot is open source.
    • Thousands of vendors exist for a multitude of Java products.
    • Thousands of open source libraries exist for Java.
    • Community governance via the JCP (pre-Oracle).

    This is quite a résumé for any language, and it shows: Java has enjoyed a long streak as one of the most popular languages around.
    So why, in late 2010 and 2011, has Java suddenly become the hated demon it is?
    • It's popular to hate Java.
    • C-like syntax is no longer popular.
    • Hate for Oracle is being leveraged to promote individual interests.
    • People have been exposed to really bad code that happens to have been written in Java.
    • … insert next hundred reasons here.

    Java, the actual language and API, does have quite a few real problems... too many to list here (a mix of native and object types, an abundance of abandoned APIs, inconsistent use of checked exceptions). But I'm offering an olive branch... Let's discuss the real problem and not throw the baby out with the bathwater.

    So what is the real problem in this industry? Java, with its faults, has completely conquered web application programming. On the sidelines, charging hard, new languages are being invented at a mind-blowing rate, also aiming to conquer web application programming. The two are pitted against each other, and we're left with what looks like a bunch of preppy mall kids battling for street territory by break dancing. And while everyone is bickering about whether PHP or Rails 3.1 runs faster and can serve more simultaneous requests, there lurks a silent elephant in the room, which is laughing quietly as we duke it out in childish arguments over syntax and runtimes.

    Tell me, what do the following have in common?
    • Paying with a credit card.
    • Going to the emergency room.
    • Adjusting your 401k.
    • Using your insurance card at the dentist.
    • Shopping around for the best car insurance.
    • A BNSF train pulling a Union Pacific coal car.
    • Transferring money between banks.
    • Filling a prescription.

    All of the above industries are billion-dollar players in our economy, and all of them write new COBOL and mainframe assembler programs. I’m not making this up; I work in the last industry, and I’ve interviewed and interned in the others.

    For God’s sake, people: COBOL, invented in 1959, is still being written today, for real! We’re not talking about maintaining a few lines here and there; we’re talking thousands of new lines, every day, to implement new functionality and new requirements. These industries haven’t even caught word that the breeze has shifted to the cloud. These industries are essential; they form the building blocks of our economy. Despite this, they do not innovate, and they carry massive expenses with their legacy technology. The costs of running these businesses are enormous, and a good percentage of those costs are IT costs.

    How expensive? Let’s talk about mainframe licensing, for instance. Let’s say you buy the Enterprise version of MongoDB and put it on a box. You then proceed to peg the CPU doing transaction after transaction against the database… The next week, you go on vacation and leave MongoDB running without doing a thing. How much did MongoDB cost in each of those weeks? The same.

    Mainframe software is licensed much differently. Let’s say you buy your mainframe for a couple million dollars and buy a database product for it. You then spend all week pegging the CPU(s) with database requests. You check your mail, and you now have a million-dollar bill from the database vendor. Wait, I bought the hardware; why am I paying another bill? Software on a mainframe is often billed by usage, that is, by how many CPU cycles you spend using it. If you spend 2,000,000 CPU cycles running the database, you will end up owing the vendor $2 million. Bizarre? Absolutely!
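
    To make the contrast concrete, here is a back-of-the-envelope sketch. Every figure in it is invented to mirror the example above (real mainframe software is typically metered in MSUs or MIPS, and rates vary by vendor):
    [CODE]
    // Hypothetical comparison of flat-rate vs. usage-metered licensing.
    // All figures are made up for illustration, not real vendor pricing.
    public class LicensingSketch {
        public static void main(String[] args) {
            double flatFeePerWeek = 10_000.0;   // fixed fee, independent of usage
            double feePerCycle = 1.0;           // usage-based fee per CPU cycle

            long busyWeekCycles = 2_000_000L;   // CPU cycles burned in a busy week
            long idleWeekCycles = 0L;           // on vacation, nothing running

            System.out.printf("Flat license:  busy $%,.0f, idle $%,.0f%n",
                    flatFeePerWeek, flatFeePerWeek);
            System.out.printf("Usage-metered: busy $%,.0f, idle $%,.0f%n",
                    busyWeekCycles * feePerCycle, idleWeekCycles * feePerCycle);
        }
    }
    [/CODE]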

    These invisible industries you rely on every day are full of bloat, legacy systems, and high costs. Java set out to conquer many fronts, and while it thoroughly took over the web application arena, it fizzled out in centralized computing. These industries are ripe for reducing costs and becoming more efficient, but honestly, we’re embarrassing ourselves. They stick with their legacy systems because they don’t think Ruby, Python, Scala, Lua, PHP, or Java could possibly handle the ‘load’, scalability, or uptime requirements that their legacy systems provide. This is so far from the truth, but again, there has been zero innovation in these arenas in the last 15 years, despite web technology making galaxy-sized leaps.

    So next week someone will invent another DSL that makes Twitter easier to use, but your bank will be writing new COBOL to transfer funds to another bank more efficiently. We’re embarrassing ourselves with our petty arguments. There is an entire economy that needs to see the benefits of distributed computing, but if the friendly fire continues, we’ll all lose. Let’s stop these ridiculous arguments, pass the torch peacefully, and conquer some of these behemoths!
    [B]AMD FX 6300 | Asus M5A97 Evo R2.0 | Saophire HD 7870 Ghz Edition | Dell S2204L | WD Blue 1TB | Kingston HyperX Blu 4GB | Seasonic S12ii 520W | Samsung 24x DVDRW | Corsair 200R | Dell Keyboard | Logitech MX518 | Edifer X600 | Microtek 1kVA UPS[/B]

  9. #9
    Broken In
    Join Date
    May 2009
    Location
    Bangalore
    Posts
    116

    Default Re: The Geeks Daily

    Thanks for posting the article "How I Failed, Failed, and Finally Succeeded at Learning How to Code". Been going through it.
    http://bugup.co.cc

  10. #10
    Wise Old Owl sygeek's Avatar
    Join Date
    Apr 2011
    Location
    Lucknow
    Posts
    1,876

    Default Re: The Geeks Daily

    By Alex Schiff, University of Michigan

    Spoiler:

    A month ago, I turned down a very good opportunity from a just-funded startup to continue my job for the rest of the summer. It was in an industry I was passionate about, I would have had a leadership position, and, having just received a raise, the pay would have been substantially higher than at most jobs for 20-year-old college students. I had worked there for a year (full-time during last summer and part-time during the school year), and common sense should have pushed me to go back.

    But I didn’t.

    I’ve never been one to base my actions on others’ expectations. Just ask my dad, with whom I was having arguments about moral relativism by the time I was 13. That’s why I didn’t think twice about the implications of turning down an opportunity most people my age would kill for to start my own company. When you take a leap of faith of that magnitude, you can’t look back.

    That’s not how the rest of the world sees it, though. As a college student, I’m expected to spend my summers either gaining experience in an internship or working at some job (no matter how menial) to earn money. Every April, the “So where are you working this summer?” conversation descends on the University of Michigan campus like a storm cloud. When I told people I was foregoing a paycheck for at least the next several months to build a startup, the reactions were a mix of confusion and misinformed assumptions that I couldn’t land a “real job.”

    This sentiment surfaced recently in a conversation with a family member who asserted that I needed to “pay my dues to society” by joining the workforce. And most adults I know tell me I need to get a real job before starting my own company. One common thought is, “Most of the world has to wait until they’re at least 40 before they can even think about doing something like that. Why should you be any different?” It almost feels like people assume we have some sort of secular “original sin” that demands I work for someone else before I do what makes me happy. Even when I talk to peers who don’t understand entrepreneurship, their reaction can be subtle condescension and comments like, “Oh that’s cool, but you’re going to get a real job next summer or when you graduate, right?”

    This is my real job. Building startups is what I want to do with my life, preferably as a founder. I’m really bad at working for other people. I have no deference to authority figures and have never been shy to voice my opinions, oftentimes to my detriment. I also can’t stand waiting on people that are in higher positions than me. It makes me feel like I should be in their place and really gets under my skin. All this makes me terrible at learning things from other people and taking advice. I need to learn by doing things and figuring out how to solve problems by myself. I’ll ask questions later.

    As a first-time founder, I can’t escape admitting that starting Fetchnotes is an immense learning experience. I’m under no illusion that I have any idea what I’m doing. I’m thankful I had a job where I learned a lot of core skills on the fly: recruiting, business development, management, a little sales and a lot about culture creation. But what I learned — and what most people learn in generalist, non-specialized jobs available to people our age — was the tip of the iceberg.

    When you start something from scratch, you gain a much deeper understanding of these skills. Instead of being told, “We need Drupal developers. Go find Drupal developers here, here and here,” you need to brainstorm the best technical implementation of your idea, figure out what skills that requires and then figure out how to reach those people. Instead of being told, “Go reach out to these people for partnerships to do X, Y and Z,” you need to figure out what types of people and entities you’ll need to grow and how to convince them to do what you need them to do. When you’re an employee, you learn the “what”; when you’re a founder, you learn the “how” and “why.” You need to learn how to rally and motivate people and create a culture in a way that just isn’t remotely the same as it is for a later-hired manager. There are at least 50 orders of magnitude in the difference between the strategic and innovative thinking required of a founder and that of even the most integral first employee.

    Besides, put yourself in an employer’s shoes. You’re interviewing two college graduates: one who started a company and can clearly articulate why it succeeded or failed, and one who had an internship at a “brand name” institution. If I’m interviewing with someone who would choose the latter candidate, that’s not a place I want to work. It’s likely a “do what we tell you because you’re our employee” working environment. And if that sounds like someone you want to work for, this article is probably irrelevant to you anyway.

    That’s why I never understood the argument about needing to get a job or internship as a “learning experience” or to “pay your dues.” There’s no better learning experience than starting with nothing and figuring it out for yourself (or, thankfully for me, with a co-founder). And there’s no better time to start a company than as a student. When else will your bills, foregone wages and cost of failure be so low? If I fail right now, I’ll be out some money and some time. If I wait until I’m out of college, have a family to support and student loans to pay back, that cost could mean being poor, hungry and homeless.

    Okay, maybe that’s a little bit of hyperbole, but you get my point. If you have a game-changing idea, don’t make yourself wait because society says you need an internship every summer to get ahead. To quote a former boss, “just **** it out.”

    Alex Schiff is a co-founder of The New Student Union.
    [B]AMD FX 6300 | Asus M5A97 Evo R2.0 | Saophire HD 7870 Ghz Edition | Dell S2204L | WD Blue 1TB | Kingston HyperX Blu 4GB | Seasonic S12ii 520W | Samsung 24x DVDRW | Corsair 200R | Dell Keyboard | Logitech MX518 | Edifer X600 | Microtek 1kVA UPS[/B]
