Programming (explanations for the non-programmers, mostly)

Discussion in 'General Chatter' started by seebs, May 3, 2015.

  1. Codeless

    Codeless Cheshire Cat

    Ok, that explanation makes sense, thank you.

    @Adm You pretty much lost me, you're talking programmer again.

    Also, in my defense, I mentioned earlier in this thread that I absolutely suck at math. (I actually suspect dyscalculia, but have no idea how to find out.)

    @Mentarnes Thanks. Sorry, there's no non-rude way to put this, but you answered a question I didn't ask, because I already knew the answer.
     
    Last edited: May 4, 2015
  2. Lissiel

    Lissiel Dreaming dead

    Trying my hand at answering, more to think through my own understanding than to offer an actual response. Hopefully someone will look at this answer and tell us both where I'm full of shit/just straight-up wrong.

    You're telling the computer you have a thing called 'a'. Ok, says the computer, what type of thing is it? The computer knows about numbers, characters, boolean algebra values (true or false), and possibly some other stuff, idk. 'It's a number,' you tell the computer, and now it knows that if you tell it a = b, you mean 'the number value of variable b' and not, say, 'the actual letter b'.
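
    In C-ish code (just my illustrative sketch; the names a, b, and c are made up), that conversation looks roughly like:

        int a;         /* "computer, I have a thing called a, and it's a number" */
        int b = 5;     /* "b is also a number, and its value is 5" */
        a = b;         /* "copy the number stored in b into a" -- a is now 5 */
        char c = 'b';  /* by contrast, c holds the actual letter b */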

    Boolean algebra: 'a' can be either true or false. If a is true, then b. If b is true, then c. If a is false, then d. If c, then a. If c, then not d.

    http://en.m.wikipedia.org/wiki/Boolean_algebra
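
    A minimal sketch of one of those rules in C (using <stdbool.h>; my code, with names just following the post):

        #include <stdbool.h>

        bool a = true;   /* a can only ever be true or false */
        bool d = false;

        if (!a) {        /* "if a is false, ..." */
            d = true;    /* "...then d" */
        }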
     
    • Like x 2
  3. seebs

    seebs Benevolent Dictator

    "Primitive" is used in a lot of programming contexts to refer to sort of fundamental things which are their own self-contained thing not defined in terms of other things. So for instance, you might refer to a "circle" as a primitive. Now say you make a venn diagram, with two circles that overlap, and you refer to the shape of the overlap. That's a not-a-primitive. So it's sort of contextual. In terms of types, often "innate" types, like the kinds of numbers the hardware can store, might be "primitives". If you're talking about code, sometimes fundamental operations are called "primitives".
     
  4. Mala

    Mala Well-Known Member

    I did mention earlier that I only knew a bit of programming...
    also, a tiny dog has claimed one arm and I can't chicken-peck decent explanations out fast enough XD
     
    • Like x 1
  5. Codeless

    Codeless Cheshire Cat

    @seebs Yep, makes sense. Primitives are things which are not made up of other things.
    @Mentarnes Ah yes, pets. They get in the way....

    I hope I'm not being annoying; I was just having fun playing question-and-explanation science.
    Results so far: I got several answers I understand, one answer I did not understand, and two answers to questions I did not ask. Now I kinda wish I had a bunch of people in another field I know nothing about to try that on.
     
    • Like x 2
  6. seebs

    seebs Benevolent Dictator

    One of the things it tests for, I think, is whether you say "well, if this means anything, it must mean something else". But! Also worth noting that once people have spent a couple of weeks studying and learned about assignment and so on... Nothing changes. So even once people have been told about =, the approximate percentage of people who get each kind of answer doesn't change. Which is strange to me.
     
  7. Morven

    Morven In darkness be the sound and light

    Though note that they're not saying that people don't learn; they're saying that the people with a consistent mental model at the beginning (even if wrong) still have a consistent mental model at the end (possibly/probably now right). It's the people without a consistent mental model who still don't at the end.
     
  8. seebs

    seebs Benevolent Dictator

    Depends on the language, and that one's not written in any specific language, so I don't know. But in a lot of languages, "int a = 10" is telling the computer about a.

    I don't like "meaningless", but what I think they're trying to get to is, there's no inherent conceptual backing behind it. "a = 20" isn't saying a meaningful thing about something there are 20 of. It doesn't mean 20 eggs. It doesn't mean "more than nineteen". It's just a raw number with no inherent meaning. The point being, if you give a computer the wrong instructions, it can't say "wait, that doesn't make any sense", because there's no sense to be made. (This is not always 100% true, mind.)
     
  9. seebs

    seebs Benevolent Dictator

    Yeah. If your brain is in the habit of reshuffling things and trying to figure them out, and build a model of what you're seeing, you get better at it. If you aren't, it's hard to teach that.
     
  10. Starcrossedsky

    Starcrossedsky Burn and Refine

    Yeah, that's exactly why I'm questioning how effective that test is. Now that I know that = in this context does not mean = in the familiar context, the problem's a cinch. And it took all of ten minutes to explain that the last statement actually meant "the value of thing a is now thing b." Even given that I'm quick at those things, it's definitely something that should have improved over the course of a couple weeks.
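
    For anyone following along, the question being discussed looks roughly like this in C (my reconstruction, not the actual test wording):

        int a = 10;
        int b = 20;
        a = b;  /* not "a equals b": "the value of a is now the value of b" */
                /* afterwards a is 20, and b is still 20 */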
     
  11. Morven

    Morven In darkness be the sound and light

    All the 'meaning' is assigned by us, not the computer, which is conceptually a really simple device; it can do math, logic, comparisons, move values to different places, and jump around. (It's a pretty complicated device in the interests of doing a huge fuckton of these simple things per second.)
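
    As a contrived C fragment (mine, purely to illustrate): even a loop boils down to exactly those operations, some math, a comparison, and a jump:

        int total = 0;
        int i = 1;
    top:
        total = total + i;  /* math */
        i = i + 1;          /* more math */
        if (i <= 10)        /* comparison */
            goto top;       /* jump */
        /* total is now 55; nothing above but moves, math, tests, jumps */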
     
  12. Exohedron

    Exohedron Doesn't like words

    @Lissiel: Meaningless in that the computer has no idea what integers, a, = or 20 are. The statement means something to us, but to the computer it's just symbols.
    The model I tend to use is Searle's Chinese Room analogy. Picture a room with no doors or windows, only a tiny slot just barely big enough to slip a piece of paper through. Inside the room is a man and a big book and a stack of blank paper and a pen. The man doesn't know any Chinese. Every once in a while, a piece of paper with Chinese characters written on it gets slipped in through the slot. He looks at the characters, matches them up to symbols in the book.
    The book is not a dictionary. It's a set of instructions. All it does is say things like "if you see this symbol and that symbol and not this other symbol, write this symbol on a piece of paper." And when he gets to the end of the characters and the end of the instructions, he sends the bits of paper that he wrote on back out through the slot. The man does no thinking; he just looks things up in the book and follows the instructions.
    And the person on the other side of the slot, who can read Chinese and has no idea what the room contains, thinks that whoever is in the room must be very wise indeed.
    That's a computer. It does no thinking; it just follows instructions. Meaning is assigned by the people writing and reading the code and examining the results, but the computer only sees meaningless symbols and arbitrary instructions.
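
    The book, reduced to a toy C program (entirely made up, just to show the shape of "match symbols, copy symbols" with no understanding anywhere in the loop):

        #include <stdio.h>
        #include <string.h>

        /* the "book": pairs of (symbols seen, symbols to write back) */
        static const char *book[][2] = {
            { "ni hao",       "ni hao" },
            { "ni chi le ma", "chi le" },
        };

        int main(void) {
            char slip[64];  /* the paper slipped through the slot */
            if (fgets(slip, sizeof slip, stdin)) {
                slip[strcspn(slip, "\n")] = '\0';  /* trim the newline */
                for (size_t i = 0; i < sizeof book / sizeof book[0]; i++)
                    if (strcmp(slip, book[i][0]) == 0)
                        puts(book[i][1]);  /* matched: copy the answer out */
            }
            return 0;  /* no step above required knowing any Chinese */
        }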
     
    • Like x 3
  13. seebs

    seebs Benevolent Dictator

    I feel I should point out: I don't actually buy the Chinese Room, because I don't think that a set of instructions without any comprehension can be expected to reliably do that. I am not sure whether or not computers think, but I am not convinced that they definitely don't.

    And I'd agree that the test question seems sort of odd, but this does match the general pattern observed throughout CS teaching and programming in general; most of the time, people either get it or don't, and stuff like "but the syntax was confusing" appears to have a much smaller effect, even though it seems like it ought to be significant. I am not totally sure why.
     
  14. Morven

    Morven In darkness be the sound and light

    One thing I'm pretty sure on is that classic CS instruction fails a pretty damn large subset of those who try it. Whether it's because their minds just can't do it, or because the teaching's bad, I'm not sure. Probably some degree of both.

    And damn, yes, there were plenty of CS students in my class who Never Got It. Some of whom graduated.
     
  15. Exohedron

    Exohedron Doesn't like words

    I'm kind of the converse; I don't think the Chinese Room is a good thought-experiment for a difference between human and computer cognition because I'm not convinced that humans definitely do think in a way that is distinguishable from the Chinese Room type setup. It's just several levels of emergent phenomena on top of heavily hardware-specific code.
    The point of the Chinese Room is that the man has no idea what he's doing, but there's more to the room than the man; there's the book. The Chinese Room is the entire man-book-pen-paper-slot system, and if there is meaning to be had, it's in the book. The man is just a tool for the book to use. But the analogy has the computer being the man, while the book is a compiler/interpreter created by a human being, so it is allowed to have meaning imparted by its author. The man is still clueless.
     
  16. Lissiel

    Lissiel Dreaming dead

    Do you think there may be social or other biases in the way programming is taught that could account for that? Sort of like how you can teach geometry as abstract numbers and rules or as actual visible/tangible shapes moving through space, and different people will understand it better one way vs the other?
     
  17. seebs

    seebs Benevolent Dictator

    The Chinese Room is mostly a bait-and-switch; it relies on your view of the man as the relevant actor, but stipulates that it is somehow possible to perfectly emulate thinking behavior without him being aware of meaning. And since you're prone to anthropocentrism, you infer that there is no understanding because the person doesn't understand.
     
  18. Exohedron

    Exohedron Doesn't like words

    As an argument for anything in particular, I agree that the Chinese Room is a bad argument. As a model, I find it useful for very firmly separating the "instructions for how to do a thing" from the "object that does the thing". In particular, it makes the point that the I/O language is not the internal language of the processing unit, so the semantics of the I/O language don't necessarily translate into semantics of the internal language, and the object that does the thing doesn't have to understand the semantics of the I/O language in order to do things in it. Maybe not naturally, maybe not adaptably, maybe only in certain subtopolects of Chinese that can't be considered complete languages, but it can do things.
    I think this is an important distinction to make, between the instructions and the instructee and what each needs to understand. The instructor can understand all she wants. The instructee follows orders.
     
  19. Wiwaxia

    Wiwaxia problematic taxon

    Fun fact: you can fool Google with genus-species names. I know this because of reasons.

    geology geology geology
     
  20. seebs

    seebs Benevolent Dictator

    That is a good point about the analogy.
     