Teaching languages -v- Industrial languages

  • 03-05-2012 12:46pm
    #1
    Registered Users Posts: 40,038 ✭✭✭✭Sparks


    [mod note]Split out from this thread[/mod note]
    dazberry wrote: »
    What seems to have happened is that a lot more industry-specific technologies are now taught (e.g. no Pascal or Modula-2 etc.), and from what I've seen the industry has got it into its head that it wants grads to have very specific skillsets that they can use immediately, and I'm guessing those expectations are now not being met?
    Worse, people started to believe that because colleges didn't produce those grads, colleges were doing it wrong. And worse yet, colleges started buying into that.

    That's why you can now - without much effort - find grads with good grades from computer science and computer engineering courses who have never used a command line, or who don't fully grok how you go from source code in an editor window to voltage levels on silicon (and while the last step in that chain might only be of academic interest to most, it's the lack of knowledge of the preceding links in that chain that leads to things like minimum specs for IDEs being 2GB of RAM...).

    Simple fact is that a college graduate shouldn't have spent their final year learning the latest and greatest tech as part of their courses. As extracurricular learning, cool, and it'd stand to them because it separates them from the others; but if you try to roll the latest language into a four-year academic course, you will fail the students for several reasons:
    • You may guess wrong about what the latest tech is,
    • The latest tech you choose may only apply to a small part of the IT sector - so you satisfy one company and the other 219 grads find they're not so qualified for jobs elsewhere,
    • The demonstrators and technicians running the labs, and the lecturer teaching the course, won't have as deep a knowledge of the tech as they might if you didn't have so much churn in the curriculum,
    • You'll be spending time and effort learning a new language that you could have spent learning something else - either learning an existing language or paradigm better, or learning a new paradigm (rather than a new language), or learning something else that's more general and more useful in the long haul.

    Personally, I think that college students shouldn't be taught the latest fashion, or even industrial languages like Java and C# (which are not designed for teaching principles, but are designed to get the job done quickly, and those goals are almost mutually exclusive). If you want to teach the procedural language paradigm, I still think Pascal is the better language, even though C and C++ are the more common languages used in industry for that paradigm. Likewise, C++ is *not* the language to use to teach OO programming, even though it and Java are the most common OO languages used in industry. This whole drive to use industrial tools in colleges just ignores the point that college has a different goal to industry (and a different client to meet that goal for). Industry wants products and sales this quarter for the company's profit margins and the shareholders' returns; college is meant to provide its graduates with skills for the rest of their professional careers, which means teaching the fundamentals well.



Comments

  • Registered Users Posts: 71 ✭✭zdragon


    Sparks wrote: »
    industrial languages like Java and C# (which are not designed for teaching principles, but are designed to get the job done quickly, and those goals are almost mutually exclusive). ...
    wrong wrong wrong...
    please tell me that you are not a programmer. Otherwise I'll be very disappointed.


  • Registered Users Posts: 7,157 ✭✭✭srsly78


    popcorn.gif


  • Registered Users Posts: 71 ✭✭zdragon


    I've been in the IT industry for a long time, and I'll be frank: years of experience do not matter. I've seen enough "experienced" arse-scratchers and enough hard-working grads.

    I won't even mention the grads who don't have a clue about programming.

    More to the point, programming is not about the language; it's about your ability to think, about logic, and about how you approach a problem and come up with a solution.

    Unfortunately, many young people (not only grads) lack ambition and passion for programming.

    As said above: while looking for a job, get onto an open source project and become a committer, and you'll find out how much you learn and how much experience you pick up.

    Regarding employers - well, even after 10 years in IT you may get canned answers such as "not experienced enough" or "other candidates were a better match". Employers are looking for cheap labour capable of working 24/7.

    Legacy code, firefighting, stubborn managers, revenue-focused directors who don't see the value in non-sales staff - all of these are the dark side of the moon/IT.

    Even spending 5 years in such an environment may not give a graduate the necessary experience.

    And yes, training: nobody has said what sort of training they're expecting from an employer. C#? .NET? DB design? Query optimising? Then why does somebody need to spend 3-4 years in college?


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    zdragon wrote: »
    wrong wrong wrong...
    please tell me that you are not a programmer. Otherwise I'll be very disappointed.
    No, I work in farming. People just get confused when I start talking about AI, and the next thing I know, I'm modding this forum.


  • Closed Accounts Posts: 2,696 ✭✭✭mark renton


    I'd ban him, Sparks - he's just knocked 15k off your fiscal value.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    I was wondering why my broker was trying to call me earlier alright...


  • Registered Users Posts: 13,104 ✭✭✭✭djpbarry


    zdragon wrote: »
    wrong wrong wrong...
    please tell me that you are not a programmer. Otherwise I'll be very disappointed.
    Java is a messy language that leads to bad habits. I learned that the hard way.


  • Registered Users Posts: 71 ✭✭zdragon


    djpbarry wrote: »
    Java is a messy language that leads to bad habits. I learned that the hard way.

    messy? you haven't seen C++ :)

    not the language, people.


    My point is: a college should teach how to build software, how to think in a language - e.g. messy Java :)

    not how to write a procedure in a language.


  • Registered Users Posts: 20,913 ✭✭✭✭Stark


    zdragon wrote: »
    wrong wrong wrong...
    please tell me that you are not a programmer. Otherwise I'll be very disappointed.

    Might be more helpful to say why you disagree rather than calling the poster stupid. All you've given so far is a barely coherent opinionated rant.




  • Closed Accounts Posts: 4,436 ✭✭✭c_man


    zdragon wrote: »
    messy? you haven't seen C++ :)

    not the language, the people.

    I read your post as above and got highly offended for a moment :o

    I'm very well groomed :p


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    zdragon wrote: »
    messy? you haven't seen C++ :)
    not the language, people.
    Take it from someone who's taught courses in college for eight years at undergrad and postgrad level using C, C++, Java, Erlang and assembly, and who's worked in industry for seven years using C, C++, Java, PHP, Perl, and other industrial languages: the language makes a major difference. Using a good language for the task won't cure bad habits, but using the wrong one can start them.

    At one point, the engineering course in TCD switched to teaching in Java. I fought with the guys teaching the course for months over it, and finally gave up and quit TAing the course the following year, because it became impossible to teach the labs or set projects in it (the lecturer in TCD teaches the theory; the TA teaches the practice). "Go write a linked list" resulted in 200-odd people turning in a call to java.util.LinkedList. Ban them from using java.util.LinkedList and you got 200-odd reimplementations of the java.util.LinkedList API using Vectors or other pre-made data structures.
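
    For the record, here's roughly what the exercise was after - a minimal, illustrative sketch of a hand-rolled singly linked list in Java (the class and method names are mine, not the assignment's), as opposed to handing in a wrapper around java.util.LinkedList. You can't write remove() without understanding what the links actually are, and that understanding was the point.
    [code]
    // A minimal hand-rolled singly linked list: the node structure and the
    // pointer juggling are the whole point of the exercise.
    public class IntList {
        // Each node owns one value and a reference to the next node (or null).
        private static class Node {
            int value;
            Node next;
            Node(int value) { this.value = value; }
        }

        private Node head;

        // Insert at the front: O(1), no shifting of elements.
        public void addFirst(int value) {
            Node n = new Node(value);
            n.next = head;
            head = n;
        }

        // Walk the chain until we find the value or run off the end.
        public boolean contains(int value) {
            for (Node cur = head; cur != null; cur = cur.next) {
                if (cur.value == value) return true;
            }
            return false;
        }

        // Unlink the first node holding the value, if any.
        public boolean remove(int value) {
            Node prev = null;
            for (Node cur = head; cur != null; prev = cur, cur = cur.next) {
                if (cur.value == value) {
                    if (prev == null) head = cur.next;
                    else prev.next = cur.next;
                    return true;
                }
            }
            return false;
        }

        public static void main(String[] args) {
            IntList list = new IntList();
            list.addFirst(3);
            list.addFirst(2);
            list.addFirst(1);
            list.remove(2);
            System.out.println(list.contains(2)); // prints false
            System.out.println(list.contains(3)); // prints true
        }
    }
    [/code]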

    That right there is the core of the problem. Industrial languages are there for people who know what they're doing, providing shortcuts to get things done fast, and they're deeply successful and popular in the faster-faster-faster-sellupandgetout startup culture; but they're worse than useless for teaching people the fundamentals, and I'd go so far as to say that in some cases, they're actively harmful. They're like power tools in a beginners' carpentry class; yes, professionals can't work without them, but a beginner is supposed to have the time to use a hand saw instead of removing their thumb with a table saw.

    And then you go out into industry and find codebases with millions of lines of code, all written by people taught using shortcut languages in shortcut IDEs with no idea how their code gets compiled, how it works in the ecosystem the OS provides, who make lousy coding and programming decisions because of this ignorance, and you get to live with the consequences of that.
    My point is: a college should teach how to build software, how to think in a language - e.g. messy Java :)
    not how to write a procedure in a language.
    That wasn't the point being argued, and it's wrong anyway:
    • the point being argued was that it is easier to teach someone a programming concept in one language than in another, and if taught well, you can go on to learn how to implement that concept in most other languages more easily than learning it from scratch in those languages;
    • colleges should never teach someone how to think in just one language. Someone who only thinks in assembly has a hard time with OOP. Someone who only thinks in Java has a hard time with Erlang. And so on. What colleges teach is how to think in paradigms (and they try to teach more than one). If you come out of college only able to think in Java, you've been fleeced of a good education, and your worth to the industry is lower as a result and your long-term career is pretty much locked into either the management track or changing careers.


  • Registered Users Posts: 13,104 ✭✭✭✭djpbarry


    zdragon wrote: »
    not the language, people.
    True. I guess I should have said it's easier to develop bad habits coding in Java and it does surprise me that some degree courses use Java as a model language - C is much better suited to teaching fundamentals.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    djpbarry wrote: »
    C is much better suited to teaching fundamentals.
    Well :)
    I'd say C was perfect for teaching systems programming, for teaching memory manipulation and how data structures are mapped in memory; I'd want to teach someone assembly for a few weeks first, so that some of C's concepts, like pointers, are already taught to them using a more appropriate language.

    And for teaching procedural programming, I still believe Pascal hasn't been much improved on; Python has few competitors for teaching late-binding OOP (though Objective C would be one of those few); Erlang is quite excellent for teaching functional programming; and I've not yet found a good language for teaching early-binding OOP :D Lisp and Haskell are great for expanding the number of paradigms that students are exposed to, and Ruby (so long as Rails isn't involved) looks to me to be a great way to push the top few percent of the class, but I've not had the chance to test that hypothesis.

    Fortran, ASP.net, C# and a few others can go die in a fire, as far as teaching is concerned :D


  • Closed Accounts Posts: 3,298 ✭✭✭Duggys Housemate


    It's a bit academic to teach a language which will never be used, i.e. Pascal.


  • Registered Users Posts: 13,104 ✭✭✭✭djpbarry


    Sparks wrote: »
    Well :)
    I'd say C was perfect for teaching systems programming, for teaching memory manipulation and how data structures are mapped in memory; I'd want to teach someone assembly for a few weeks first, so that some of C's concepts, like pointers, are already taught to them using a more appropriate language.
    Fair point - I did a little assembly programming during an undergrad project and it certainly helped me get my head around some of the stuff Mr. Turing was banging on about.
    Sparks wrote: »
    And for teaching procedural programming, I still believe Pascal hasn't been much improved on; Python has few competitors for teaching late-binding OOP (though Objective C would be one of those few); Erlang is quite excellent for teaching functional programming; and I've not yet found a good language for teaching early-binding OOP :D Lisp and Haskell are great for expanding the number of paradigms that students are exposed to, and Ruby (so long as Rails isn't involved) looks to me to be a great way to push the top few percent of the class, but I've not had the chance to test that hypothesis.
    I really wish I'd discovered this forum years ago.


  • Registered Users Posts: 71 ✭✭zdragon


    Stark wrote: »
    Might be more helpful to say why you disagree rather than calling the poster stupid. All you've given so far is a barely coherent opinionated rant.

    my disagreement is with "even industrial languages like Java and C# (which are not designed for teaching principles"

    those languages are built on principles, and any Java/C# book for dummies will contain all the principles of OOP, just like old-school C++.

    The thing that does the harm is the frameworks behind those languages.

    Sparks wrote: »
    "Go write a linked list" resulted in 200-odd people turning in a call to java.util.LinkedList. Ban them from using java.util.LinkedList and you got 200-odd reimplementations of the java.util.LinkedList API using Vectors or other pre-made data structures.

    so then we can blame Java for that, not the TA?

    and why do we need procedural programming?

    I'd say it's better to introduce DI and DDD along with any OOP-capable language.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    It's a bit academic to teach a language which will never be used, i.e. Pascal.
    No, it's not. The point is not to give you a language you can use for industrial programming (though, back in the day, Pascal did build a few useful things); the point is to give you a mental tool to use to think about how to solve a problem (in this case, procedural programming). I wouldn't use Pascal to teach someone OOP or systems programming or functional programming (I don't think you even could teach those things in it), but for this one task, it's the best approach I've encountered.

    And we're talking here about taking a few months (really, you don't need more than one semester) to teach something very fundamental, properly. Against a career that can see you actively coding for forty-odd years or more and where your code or its effects can hang around for even longer, that's a good investment.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    zdragon wrote: »
    my disagreement is with "even industrial languages like Java and C# (which are not designed for teaching principles"
    those languages are built on principles, and any Java/C# book for dummies will contain all the principles of OOP, just like old-school C++.
    No, that's not factually correct.
    Java was not built to teach OOP; it was built to write OOP on one platform and run it on multiple platforms (it didn't manage to do that, but it was the objective). It was intended from the outset for industrial uses.

    C++ was not written to teach OOP either; it was written to add OO constructs to C, and it wasn't the first attempt to do so; Objective-C did the same thing, but took the late-binding rather than C++'s early-binding approach to it. And then when the language wars kicked off in earnest, every feature ever thought of was shoehorned into C++ until it became known for being unknowable. It wasn't even written using the underlying C language style as a decision aimed at furthering the original goal; it was written using the underlying C style so that people would think they wouldn't need to learn much to learn OOP, at a time when OOP was the latest craze, people were trying to get rich selling compilers, and the latest compiler was a front-page article in Byte (and that should date me fairly effectively :D).
    so then we can blame Java for that, not the TA?
    I'm sure you can avoid that trap :rolleyes:
    and why do we need procedural programming?
    I'd say it's better to introduce DI and DDD along with any OOP-capable language.
    ...and you've just proven my point for me.

    There's no such thing as a silver bullet. You might have heard that before, but I think you missed the point, which was that there is no such thing as a single approach or a single tool which works in all situations. Procedural programming accounts for more active lines of code out in industry today than any other; it's the main paradigm used in embedded programming (and embedded programming is the elephant in the room that no-one seems to talk about). OOP is great at what it does and it's a vital tool in the toolbox, but it is not the only one. OOP is like a really well-crafted, well-balanced hammer - in that it's not the best screwdriver around.

    So today, it's DDD (actually, today it's TDD, but that's just driving the point home till it's countersunk). Yesterday it was OOP. Before that it was procedural and modular. The latest fad is always changing, and it is never a good basis for a long-term career. Just ask any buggy-whip maker. And you can't tell what'll be next because there are always dozens of competing ideas, and by the time they settle into mainstream use, there are dozens more which are coming up behind them. So when doing college degrees, you ignore the bleeding edge and teach the fundamentals. And you use the best tool for that job when doing that job. Which is rarely an industrial language.


  • Registered Users Posts: 1,215 ✭✭✭carveone


    And then you go out into industry and find codebases with millions of lines of code, all written by people taught using shortcut languages in shortcut IDEs with no idea how their code gets compiled,

    What Joel Spolsky called the Shlemiel the painter problem, where you're doomed to make awful mistakes (like appending 100 strings using strcat, for example) because you don't understand the underlying algorithms.
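
    (The same trap translates straight into Java, by the way - this is an illustrative sketch of mine, not anything from Joel's article: appending to a String in a loop re-copies everything accumulated so far, so the loop is quadratic, while StringBuilder remembers where the end is and stays roughly linear.)
    [code]
    // Shlemiel the painter, Java edition: the first loop re-copies the whole
    // string on every append (quadratic work overall); the second just writes
    // one more character each time (amortised constant work per append).
    public class Shlemiel {
        public static void main(String[] args) {
            int n = 100_000;

            long t0 = System.nanoTime();
            String s = "";
            for (int i = 0; i < n; i++) {
                s += "x";            // copies all i existing characters, every time
            }
            long t1 = System.nanoTime();

            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < n; i++) {
                sb.append("x");      // appends in place, growing the buffer as needed
            }
            String s2 = sb.toString();
            long t2 = System.nanoTime();

            System.out.printf("String +=     : %d ms%n", (t1 - t0) / 1_000_000);
            System.out.printf("StringBuilder : %d ms%n", (t2 - t1) / 1_000_000);
            System.out.println(s.equals(s2)); // prints true - same result, very different cost
        }
    }
    [/code]
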
    Sparks wrote: »
    Well :)
    I'd say C was perfect for teaching systems programming, for teaching memory manipulation and how data structures are mapped in memory; I'd want to teach someone assembly for a few weeks first, so that some of C's concepts, like pointers, are already taught to them using a more appropriate language...... snip

    The answer for anyone who asks "what language should I learn" (I don't mean just C here - I mean the whole post!). The old book "Data Structures and Algorithms" (Aho, Hopcroft, Ullman), which I got in a sale for 2 quid years ago, has all the algorithms explained in Pascal pseudocode (with pointers). That doesn't mean you can't implement them in any language you choose. The important bit is the concepts... Pointers in C are quite trivial if you know them in assembly - if I'm talking about them, I introduce asm first.
    Fortran, ASP.net, C# and a few others can go die in a fire, as far as teaching is concerned :D

    I did Fortran in UCD in '89 because it was an engineering language. Some Fortran dialect (Watfiv I think) that wasn't even Fortran 77 compliant. Even had to leave the first 7 characters blank because that's the way the punched cards were. Bleugh.

    As an assembly programmer first I indeed had issues with OOP. Until Python. Made the whole thing a bit clearer :o

    I wonder what it was like to work on the computer I saw last week in Manchester - store was a magnetic drum, memory was the fading phosphor on a CRT. Nice!


  • Registered Users Posts: 71 ✭✭zdragon


    Sparks wrote: »
    No, that's not factually correct.
    Java was not built to teach OOP; ...
    C++ was not written to teach OOP either;


    So today, it's DDD (actually, today it's TDD, but that's just driving the point home till it's countersunk). Yesterday it was OOP. Before that it was procedural and modular.

    You may be surprised, but Pascal was not invented with the purpose of teaching someone something...

    DDD vs TDD :) You are killing me; I am out of here.

    Good luck teaching modular programming to students, but I am not hiring anyone who has no clue about OOP and data modelling, no matter how excellent they are at writing algorithms in Pascal.


  • Registered Users Posts: 1,215 ✭✭✭carveone


    Sparks wrote: »
    it was built to write OOP on one platform and run it on multiple platforms (it didn't manage to do that, but it was the objective).

    I could argue (having worked for Sun) that its objective was to commoditise the things that Sun existed to produce - hardware. Looked like it succeeded in that respect. Although the dotcom crash and the hardware price crash really nailed them.
    And then when the language wars kicked off in earnest, every feature ever thought of was shoehorned into C++ until it became known for being unknowable.

    A friend of mine worked in C++ for 10 years and his opinions on the language are a bit rude! He now works with PHP and CodeIgniter. There's only so much you can take.
    There's no such thing as a silver bullet. You might have heard that before, but I think you missed the point, which was that there is no such thing as a single approach or a single tool which works in all situations.

    Unless you're a HP sales exec writing to the Irish Times for the 20th time complaining that schools need to buy more (HP) PCs running Windows to "Enable the next generation of ICT professionals to prevent future shortages of IT staff blah blah blah". Sure. Nothing to do with padding your personal account with government money.

    The thing that made 80s teens into 90s programmers wasn't PCs but the availability of cheap hardware like the Spectrum that kids could plug into the TV. Combined with the zero gap between the machine and a programming language (turn on, there you go), it encouraged experimentation like never before. It certainly had nothing to do with highlighting a piece of text and making it bold in MS Word.

    PCs: The first PC I ever saw had "NO ROM BASIC" on its screen. I was so impressed. I'd argue vociferously that programming for DOS did more damage to me as a programmer than using various odd dialects of 1980s BASIC ever did (except BBC BASIC with its inbuilt 6502 assembler). I still pause if allocating more than 64K because my subconscious is screaming about 16-bit limits and far pointers. Remember aligned pointers where you adjusted the segment to make the offset zero? Jesus...
    (and embedded programming is the elephant in the room that no-one seems to talk about).

    Ok I take that back about DOS. Embedded programming is all about limits :p


  • Registered Users Posts: 586 ✭✭✭Aswerty


    carveone wrote: »
    I could argue (having worked for Sun) that its objective was to commoditise the things that Sun existed to produce - hardware. Looked like it succeeded in that respect. Although the dotcom crash and the hardware price crash really nailed them.

    I'm not familiar with why Sun produced Java, but I would expect that Sun tried to commoditise enterprise programming languages to allow them to ship more hardware. Commoditising your own product doesn't make sense unless you have another product that you can up-sell from it, which is what I expect was the reasoning behind Java.


  • Registered Users Posts: 1,215 ✭✭✭carveone


    Aswerty wrote: »
    I'm not familiar with why Sun produced Java, but I would expect that Sun tried to commoditise enterprise programming languages to allow them to ship more hardware. Commoditising your own product doesn't make sense unless you have another product that you can up-sell from it, which is what I expect was the reasoning behind Java.

    It was a comment from Joel Spolsky that rang true to me at the time. I had a hard time trying to understand Sun's long-term strategy at any point - they were trying to do free software and ship hardware, they had this Write Once Run Anywhere strategy, and at the same time they were trying to go the desktop route. I think they made the mistake of not being able to see past their hate/fear of MS; otherwise they would have stayed with their strengths and not tried to compete on the desktop. They kept shifting strategy directions. Then the dotcom implosion really burned them. When I left Sun, they had two buildings and 300 staff in Eastpoint. A few years later they were letting everyone go...


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    zdragon wrote: »
    You may be surprised, but Pascal was not invented with the purpose of teaching someone something...
    Wirth would have disagreed with you rather strongly; even a quick Wikipedia search, or reading any introduction to the language, would have told you this. I'm getting the impression that you're not arguing from knowledge here, zdragon.
    but I am not hiring anyone who has no clue about OOP and data modelling, no matter how excellent they are at writing algorithms in Pascal.
    To be honest, I don't think that's going to be a major worry for anyone with a good foundation in programming, because they're going to get snapped up for higher pay by other places. The top few percent of CS and CEng courses usually are.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    carveone wrote: »
    I could argue (having worked for Sun) that its objective was to commoditise the things that Sun existed to produce
    I would have argued that that was what Sun pushed it to become, but it wasn't what it was originally supposed to do (whereas Pascal got pushed to do other things but was originally intended as a teaching language).
    Looked like it succeeded in that respect.
    I'm not sure Javachips were ever that successful to be honest, which was (I thought) a shame - they were a really interesting idea.
    A friend of mine worked in C++ for 10 years and his opinions on the language are a bit rude!
    I'm working in it now (again!) and I'd probably agree with his opinions :D
    Unless you're a HP sales exec writing to the Irish Times for the 20th time complaining that schools need to buy more (HP) PCs running Windows to "Enable the next generation of ICT professionals to prevent future shortages of IT staff blah blah blah". Sure. Nothing to do with padding your personal account with government money.
    Yup.
    The thing that made 80s teens into 90s programmers wasn't PCs but the availability of cheap hardware like the Spectrum that kids could plug into the TV. Combined with the zero gap between the machine and a programming language (turn on, there you go), it encouraged experimentation like never before. It certainly had nothing to do with highlighting a piece of text and making it bold in MS Word.
    It definitely created a huge amount of interest, but I wish it hadn't been BASIC they'd used; even a good version of it like QuickBASIC 4.5 would have been better (yeah, it's still BASIC, but it was a damn sight nicer than the C64 variant of BASIC).

    Which does rather point out the language part of the argument again.
    But I'd rather see lots of people widely using a bad language than nobody at all using a good one!
    I still pause if allocating more than 64K
    *applause*
    Ok I take that back about DOS. Embedded programming is all about limits :p
    Yeah, but the limits in embedded programming are far stricter and more quirky :D Ever hit the stack limit accidentally when programming in C on a PIC because of a library call you didn't scope properly and which the compiler didn't catch? And it's run-time too, so you'd wind up debugging with a logic analyser if you weren't rich enough for the ICE :D


  • Registered Users Posts: 2,018 ✭✭✭Colonel Panic


    I'd always test someone on their problem-solving skills and knowledge of the application of data structures and algorithms before I'd ask them if they know about Dependency Injection or Test Driven Design.

    I use the latter quite heavily as well as the former, but I've always thought knowing the former helped me realise that the latter are by no means a silver bullet.
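
    (For anyone who's only heard the buzzword: constructor-based dependency injection is nothing more exotic than handing a class its collaborator instead of letting it construct one itself, which is also what makes it testable without a framework. A minimal sketch - the interface and class names here are made up for illustration:)
    [code]
    // Constructor injection with no framework involved: Greeter doesn't build
    // its own Clock, it's handed one, so a test can hand it a fixed fake instead.
    interface Clock {
        long millisSinceMidnight();
    }

    class SystemClock implements Clock {
        public long millisSinceMidnight() {
            return System.currentTimeMillis() % 86_400_000L; // UTC, good enough here
        }
    }

    class Greeter {
        private final Clock clock;

        // The dependency comes in here, instead of 'new SystemClock()' inside.
        Greeter(Clock clock) { this.clock = clock; }

        String greet() {
            return clock.millisSinceMidnight() < 43_200_000L ? "Good morning" : "Good evening";
        }
    }

    public class DiSketch {
        public static void main(String[] args) {
            // Production wiring: the real clock.
            System.out.println(new Greeter(new SystemClock()).greet());

            // Test wiring: a fixed 9am clock, no mocking framework required.
            Clock nineAm = () -> 9L * 3_600_000L;
            System.out.println(new Greeter(nineAm).greet()); // prints Good morning
        }
    }
    [/code]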


  • Registered Users Posts: 11,977 ✭✭✭✭Giblet


    Not to mention the abuse that goes on in each by seasoned professionals! (i.e. the Service Locator anti-pattern)


  • Registered Users Posts: 1,215 ✭✭✭carveone


    Sparks wrote: »
    It definitely created a huge amount of interest, but I wish it hadn't been BASIC they'd used; even a good version of it like QuickBASIC 4.5 would have been better (yeah, it's still BASIC, but it was a damn sight nicer than the C64 variant of BASIC).

    There was Forth on the Jupiter Ace, but that wasn't very popular. I came to BASIC again a few years later (in 1992, I suppose) using Turbo Basic, which was a darn sight nicer.

    Commodore BASIC - print "♥" to clear the screen, wasn't it? Yeah, that made sense...
    Yeah, but the limits in embedded programming are far stricter and more quirky :D Ever hit the stack limit accidentally when programming in C on a PIC because of a library call you didn't scope properly and which the compiler didn't catch? And it's run-time too, so you'd wind up debugging with a logic analyser if you weren't rich enough for the ICE :D

    Oh. Given that I'm now exploring C on PICs and ARM, I'm going to write that post down and pin it to something! I haven't actually used C on the smaller PICs at any stage (e.g. the PIC16F84 or PIC16F628), but it's probably necessary by the time you hit the PIC18F2550 etc. I was aware of the stack limit but wasn't in any danger of hitting it in hand-written asm. Thanks for telling me that!


  • Closed Accounts Posts: 5,064 ✭✭✭Gurgle


    Sparks wrote: »
    you'd wind up debugging with a logic analyser if you weren't rich enough for the ICE :D
    Hah, if you're really rolling out production code you end up debugging with an oscilloscope because the logic analyser and ICE can't tell you everything.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    Gurgle wrote: »
    Hah, if you're really rolling out production code you end up debugging with an oscilloscope because the logic analyser and ICE can't tell you everything.
    Ah, xkcd(378) :D

    real_programmers.png

