KWSN Orbiting Fortress
KWSN Distributed Computing Teams forum

Time For a Break

 
The King's Own
Baron


Joined: 09 Jun 2012
Posts: 232
Location: Harbor Springs, Michigan

Posted: Fri Jul 20, 2012 3:01 pm    Post subject: Time For a Break

Made 10,000,000 #ni-1 Once GPUgrid hits 10 million I am shutting down to read Linux manuals and build Shrubber 2.0.



_________________
What are you going to do, bleed on me?

Putting_things_on_top
Duke


Joined: 14 Oct 2009
Posts: 435
Location: Frostbite Falls, Minnesota, USA

Posted: Sat Jul 21, 2012 4:07 pm    Post subject: Re: Time For a Break

The King's Own wrote:
Made 10,000,000 #ni-1 Once GPUgrid hits 10 million I am shutting down to read Linux manuals and build Shrubber 2.0.



Is that Armenian currency?

Might I suggest comparing 3 'flavors' of Linux (if you're still deciding):
-- Ubuntu (seems to have the most active/aggressive development group)
-- Mint (noted for its stability and 'lean & cautious' distro philosophy)
-- Puppy (noted for its ease of implementation; driver support may be shaky).

I'd be interested to hear from you (and others?) as to what you see as the pros and cons of the various Linux distros.
This topic might be worthy of a separate thread under 'Help Scrolls' or 'Science News'.

It is no big secret that CPU shrubbing/crunching under Linux is significantly & measurably faster than under Windows.

My preliminary preference is for Mint, but I've just barely started to look into Linux.
I'm not too comfortable making any recommendations because I'm not even qualified to be a Linux noob yet (a little knowledge is a dangerous thing).

BTW, Concrete-mixing Moose seems to be quite pleased with Puppy Linux.
_________________
Click here for...KWSN F@H team summary at EOC

Or here for...KWSN F@H team overtake at EOC


Sir Papa Smurph
Cries like a little girl
Prince


Joined: 18 Jul 2006
Posts: 4430
Location: Michigan

Posted: Sat Jul 21, 2012 4:44 pm

I've tried Ubuntu a couple-three times. Driver installation is not ready for prime time. Other than that I loved it, but if I can't run my video cards, then I don't need it...

#ni-1
_________________
a.k.a. Licentious of Borg.........Resistance Really is Futile.......
and a Really Hoopy Frood who always knows where his Towel is...
Nuadormrac
Prince


Joined: 13 Sep 2009
Posts: 506

Posted: Sat Jul 21, 2012 6:10 pm

It's been a while since I did a separate boot into Linux... That said, the last time I loaded it, it was a distro of SLAMD64, which was essentially Slackware for AMD64 processors (in reality that would be any x86-64 proc; it's just that Intel's own processors using the x64 instruction set had yet to come out when it was first developed) #ni-1

In some ways, though, it matters little what distro one starts from if one is really out for max performance, which means going to kernel.org, grabbing the latest source, and compiling one's own kernel, setting the build to give maximum performance benefits for one's own specific hardware. After compiling a kernel specific to one's machine, the installation is really no longer as it came out of the box... Compiling one's own kernel does require some how-to knowledge, however, and one doesn't want to mess up (or be without a backup of the old kernel) when setting the boot up in LILO, or one could end up with a system that won't boot #ni-1
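Roughly, the routine goes like this; a sketch from memory, so treat the version number and paths as examples only and check the README in the source tree first:

Code:
cd /usr/src
wget https://www.kernel.org/pub/linux/kernel/v3.x/linux-3.4.6.tar.bz2
tar xjf linux-3.4.6.tar.bz2 && cd linux-3.4.6
make menuconfig                    # tune for your exact CPU/chipset, drop drivers you don't have
make -j"$(nproc)" && make modules_install && make install
# keep an image= stanza for the OLD kernel in /etc/lilo.conf as a fallback,
# then re-run lilo so the new boot map gets written -- skip this and you may not boot:
lilo

The fallback stanza in lilo.conf is the part people forget; with it, a bad kernel just means picking the old entry at the boot prompt.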
_________________


.
Putting_things_on_top
Duke


Joined: 14 Oct 2009
Posts: 435
Location: Frostbite Falls, Minnesota, USA

Posted: Sat Jul 21, 2012 7:16 pm

Sir Papa Smurph wrote:
I've tried Ubuntu a couple-three times. Driver installation is not ready for prime time. Other than that I loved it, but if I can't run my video cards, then I don't need it...

This seems to be the bane of most Linux distros.
Getting ATI and NVidia to provide good drivers (actually tested/QA'd) that will work well under Linux is a large part of the problem.
Even Linus Torvalds is frustrated with chipset/hardware manufacturers [specifically NVidia]; there's a YouTube video of it (warning: F-bomb!).
Discussed here: http://www.linuxcandy.com/2012/06/the-nvidia-fiasco.html

The other part of the problem is the installation of those drivers.

Many of the Linux forums have chronic complaints (and not just from the noobs) about managing hardware, drivers, etc.
Apparently (from what I've read), you have to:
-- Uninstall the default/generic drivers,
-- Install the device-specific ones, and THEN...
-- Muck around with manually editing (as many as a dozen) config files.
This can be a daunting process, especially when you need to type in dozens of commands to achieve it (see the sketch below).
And even then, it's a total crapshoot as to whether or not it will actually work.
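To give a flavor of it, here's roughly what the NVidia routine looks like on an Ubuntu-flavored system, from what I've read. A sketch only; package names, the display manager, and the installer filename all vary, so treat every line as an example:

Code:
sudo apt-get remove --purge nvidia*              # clear out any packaged driver bits
echo "blacklist nouveau" | sudo tee /etc/modprobe.d/blacklist-nouveau.conf
sudo update-initramfs -u && sudo reboot          # make sure the generic nouveau driver is gone
# ...then from a text console (the vendor installer refuses to run under X):
sudo service lightdm stop
sudo sh ./NVIDIA-Linux-x86_64-<version>.run      # the .run file downloaded from nvidia.com
sudo reboot                                      # ...and pray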

The dyed-in-the-wool "Linux-or-Death" crowd is far too enamored with their precious console-level command methodologies to allow too much GUI (pronounced "rational") development to proceed in this area.
The quandary, however, is that these dinosaurs & prima-donnas are also the ones with the most knowledge & insight of how the Linux framework actually works (or ought to).

Like it or not, the current Windows "plug-n-play" approach (which was copied from Apple) makes things a lot easier.
Plug-n-play has obviously evolved from (what we used to call) the days of plug-n-Pray!
Easier - yes! More resource-efficient - no!

And this seems to be one of the more salient points of divergence between Windows and Linux:
Most of the various Linux distros are customized/enhanced versions of (and extensions to) the essential Linux OS core-framework.
The intent of Linux (and we all know where that road leads) is to provide an OS that is "everything you need and nothing you don't".
Windows, on the contrary, throws almost everything together..."just in case you might need it someday"...and thus: BLOATWARE.

Another unfortunate reality is that Microsoft has obscene amounts of money they can put behind their development efforts.
Linux - for the most part - is freeware; and far too many Linux implementers have to operate on obscenely meager budgets.

So, when I find a stable, lean, easy-to-use, and flexible Linux distro...well, I will drop Windows like a used condom!
And I will donate $150~200 for each copy I download and actually use (it's just the right thing to do).

But, so far, I have yet to find a Linux flavor that sufficiently meets my needs! [or anemic skill-level] Sad

So, the research continues...

_________________
Click here for...KWSN F@H team summary at EOC

Or here for...KWSN F@H team overtake at EOC


Concrete-mixing Moose
Prince


Joined: 30 Apr 2012
Posts: 567
Location: The Joyce Grenfell Home for the Distressed

Posted: Sat Jul 21, 2012 11:51 pm

All depends what you want the OS for. Just for DC-ing? A little OS like Puppy is enough. It runs in RAM, no need for a HD; just set up the internet connection and you're off!
If you want HD TV and all the bells and whistles, then it's a whole new ball game.
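If you do want it on a drive, the usual "frugal install" is just three files plus a bootloader stanza. Roughly this, if memory serves (the folder name and .sfs filename are examples from a Slacko-era Puppy):

Code:
# copy the three files off the live CD onto any partition:
mkdir /mnt/sda1/puppy528
cp /mnt/cdrom/vmlinuz /mnt/cdrom/initrd.gz /mnt/cdrom/puppy_slacko_5.2.8.sfs /mnt/sda1/puppy528/
# then point GRUB at them with a menu.lst stanza along these lines:
#   title Puppy (frugal)
#   rootnoverify (hd0,0)
#   kernel /puppy528/vmlinuz pmedia=atahd psubdir=puppy528
#   initrd /puppy528/initrd.gz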
_________________

Save my home - click every day on the picture!
Putting_things_on_top
Duke


Joined: 14 Oct 2009
Posts: 435
Location: Frostbite Falls, Minnesota, USA

Posted: Sun Jul 22, 2012 12:49 am

Concrete-mixing Moose wrote:
All depends what you want the OS for. Just for DC-ing? A little OS like Puppy is enough. It runs in RAM, no need for a HD; just set up the internet connection and you're off!
If you want HD TV and all the bells and whistles, then it's a whole new ball game.

Do you know if Puppy supports the latest ATI 79xx or NVidia 6xx GPU cards?
If so, I'll definitely give it a second look.

For CPU-only shrubbing, Puppy seems like a no-brainer choice. Small, fast, easy, and efficient - seems ideal!
And keeping up with new CPUs is probably not as problematic (for Linux developers), since the fundamental architecture & instruction set does not change nearly as often or as dramatically as it does in the GPU market segment.
But for those of us who want to add some serious GPU shrubbing, the driver/hardware support is the critical decision point.
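For what it's worth, one trick I've seen suggested: boot a distro's live CD and ask which kernel driver actually claimed the card, no install needed. Something like:

Code:
lspci -nnk | grep -A3 -i vga
# "Kernel driver in use: nouveau" (or radeon) = generic open-source driver;
# "nvidia" / "fglrx" = the vendor's proprietary driver got loaded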

I did browse the Puppy site this past week.
I like their philosophy, at least in principle.
But it seems to me that they may have too many irons-in-the-fire, and might have difficulty trying to keep up with competing or conflicting demands from their own end-user community.
In Linux/Unix lingo: They might be forking around too much.
This is an all-too-common plight for these kinds of ventures.

I am slightly more intrigued by Mint.
Their focus is on the fundamentals, and they take a more conservative approach to updates/builds/releases.
They intentionally avoid the latest-n-greatest, leading-edge sort of stuff.
They want to distribute an OS that is solid, needs very little support, and has "field-proven" components/modules.
The risk of this approach, however, is being perceived as slow-on-the-uptake, techno-arrogant, or narrow-minded.

My own hesitation with Mint is trying to figure out how long it will take them to have a version that will support my hardware. Shocked


_________________
Click here for...KWSN F@H team summary at EOC

Or here for...KWSN F@H team overtake at EOC


Concrete-mixing Moose
Prince


Joined: 30 Apr 2012
Posts: 567
Location: The Joyce Grenfell Home for the Distressed

Posted: Sun Jul 22, 2012 1:54 pm

Puppy is only 32-bit (it was originally intended for old machines); FatDog is a 64-bit Puppy, see http://distro.ibiblio.org/fatdog/web/. As for cards, I'll have to see ...

Edit: Nvidia drivers from April at http://murga-linux.com/puppy/viewtopic.php?p=617029#617029 (scroll down to Speedyluck's post from April)

Edit: AMD/ATI 7900 drivers at http://linux.softpedia.com/get/System/Hardware/AMD-Catalyst-Linux-Driver-Radeon-HD-7900-Series-80043.shtml
_________________

Save my home - click every day on the picture!


Last edited by Concrete-mixing Moose on Sun Jul 22, 2012 4:11 pm; edited 1 time in total
Putting_things_on_top
Duke


Joined: 14 Oct 2009
Posts: 435
Location: Frostbite Falls, Minnesota, USA

Posted: Sun Jul 22, 2012 4:03 pm    Post subject: The old wind-bag (me) is at it again...

Nuadormrac wrote:
...
In some ways, though, it matters little what distro one starts from if one is really out for max performance, which means going to kernel.org, grabbing the latest source, and compiling one's own kernel, setting the build to give maximum performance benefits for one's own specific hardware...
...

Yes, conceptually that's what can be done.
But I'm certain there are not too many folks who have the confidence in their own [Linux] skill level to attempt this kind of personalized build.
And for those of us with virtually no Linux skills, the risk of totally hosing it up is too great.

I'm sure there's a handful of our Knights (probably no more than 6?) that do have the knowledge & skills to make a successful go of it.
The rest of us, however, ...meh!

I think the head guy at Puppy Linux (Barry Kauler) is really on to something that is sorely needed...
He is envisioning/designing an OS configurator (named "Woof") to allow end-users to use guided-selection options to create their own builds.
See discussion at: http://distro.ibiblio.org/quirky/racy-5.3/release-Racy-5.3.htm
But he admits, in a roundabout way, that the number of 'options' is getting to be overwhelming (interdependencies, mutual exclusivities, conflict avoidance, etc).
In fact, I am interested enough in this project that I am likely to volunteer to do some beta-testing for them (once they have a sufficiently decent app).

The old 1970s & 1980s paradigm of attempting to squeeze as much performance as possible from the hardware is an ancient, bygone notion ("Hardware is expensive, people are not").
The new paradigm is "Hardware is cheap, people are not".
This means that as long as the code is reasonably efficient, it is much more cost-effective to resolve nominal performance issues with more/better hardware than with extensive re-programming efforts.
The on-going issue is in how we define "reasonably efficient", and where we will need to draw the line at the point-of-diminishing-returns.

Hardware cannot always sufficiently compensate for really horrible, inefficient, and downright stupid coding.
In those cases, hardware merely serves as a bandage, splint, or crutch.

Programming is an art-form. And what the software artist strives for is: efficiency + stability + adaptability + effectiveness = elegance.

Point being: most of us don't have the time, patience, dedication, and/or interest in becoming expert enough with Linux to create/support our own builds.

I am ambivalent about Windows-7.
On the one hand: it is fairly easy to set up, not always too terrible to tweak, and has tons of available 3rd-party applications.
On the other hand: it is notoriously bloated, sometimes unreasonably sluggish, and unfortunately prone to viruses/malware.

But I am somewhat hesitant with Linux.
On the positive side: it holds out the promise of being more hardware efficient, potentially less prone to viruses/malware, and is gaining significant worldwide acceptance.
On the worrying side: it's "organic" (too many 'flavors' to choose from), it's often developed/supported by [transient] volunteers, it can be somewhat more complicated to install & maintain, and 3rd-party software is lagging (although slowly catching up).

I know that the advent of Win-8 will cause a watershed event (split) in the larger computing community.
I am one of those that have already decided that I will convert to Linux rather than endure the limitations & constraints that will be [necessarily] imposed under Win-8.
If I wanted that kind of 'safety', I could just-as-well get a Mac.


_________________
Click here for...KWSN F@H team summary at EOC

Or here for...KWSN F@H team overtake at EOC


Nuadormrac
Prince


Joined: 13 Sep 2009
Posts: 506

Posted: Sun Jul 22, 2012 5:23 pm    Post subject: Re: The old wind-bag (me) is at it again...

Putting_things_on_top wrote:
Nuadormrac wrote:
...
In some ways, though, it matters little what distro one starts from if one is really out for max performance, which means going to kernel.org, grabbing the latest source, and compiling one's own kernel, setting the build to give maximum performance benefits for one's own specific hardware...
...

Yes, conceptually that's what can be done.
But I'm certain there are not too many folks who have the confidence in their own [Linux] skill level to attempt this kind of personalized build.
And for those of us with virtually no Linux skills, the risk of totally hosing it up is too great.


Well, it is true I have a degree in computer networking... Actually, if I managed to get myself back into school (though finances have been a biatch the last 4 years), I'd be looking at an undergrad degree in computer security, which leads into a graduate degree in homeland security. A program at Drexel is actually a good fit with what I had taken previously. But there is money to work out, also #ni-1

It is funny in a way to mention compiling a kernel and whatnot as something I did many years ago, given how uncomfortable people can be at doing it. With some instruction it didn't seem that bad; but there is the possibility of hosing things if one isn't careful. It's perhaps less of an issue on a dual boot, unless someone inadvertently removes the option to boot to Windows from LILO. Even then, the Windows install disk could restore that and let one begin again, just by doing a repair and replacing the boot sector with its own....

Quote:
The old 1970s & 1980s paradigm of attempting to squeeze as much performance as possible from the hardware is an ancient, bygone notion ("Hardware is expensive, people are not").


In part, this does depend on the developers, and also on how they're developing. There comes a point where a scripted, or even compiled, language (such as C++, for instance) won't produce the same level of performance as coding something directly in assembly. Truth be told, I don't think all developers today know assembler, and for those who do, there are time constraints.

There can also be a matter of having pure software guys, who really don't understand hardware in depth, and pure hardware guys who know some programming, yes, but haven't gone too far in depth with some of the newer programming languages. And even here, "knowing" can be relative. I knew a computer engineer who worked at NASA for 11 years (around the 1970s), who from some discussions I think was on the design team that helped construct the Voyager space probes. He also worked on the design of the ATM network standard. Some conversations with him were interesting, to say the least, and went into things that college lectures didn't tend to cover. To put it bluntly, he went beyond many of the various computing models, and must also have had a degree (perhaps a second postgraduate degree) in physics. He could drill right down to what was going on with the electron, and what was going on in matter itself, and tell you outright when, where, how, and why matter might sometimes behave differently from conventional theory, with its implications for what this or that model says "should happen".

But then he could also look at a motherboard or something and pretty much predict its life span even before testing the thing, and give an indication of how the product would die. And if asked, he could get right down to the geometry on the board and the physics involved, and explain what would happen to the material it's made out of over time; his predictions tended to have a degree of accuracy to them as things played out during use.... I guess that would have been useful for someone designing space probes that would leave the orbit of Earth, for obvious reasons...

Now, I think John Carmack once stated that in some tightly nested loops, and some areas of his code that were very performance-critical, he tended to embed code which he hand-assembled into his compiled program; with an FPS-type game that would be ideal, of course, for one who doesn't want to lag and get their arse fragged Laughing But then not all developers are John Carmack, and I'm not sure all of them today would necessarily have the skillset to do that....

The person I mentioned above attested to some of the same, suggesting that there are few developers with enough expertise in both hardware and software to be very good at writing BIOS code for the various motherboards. A company can be hard pressed to find enough skilled people who can code software to control the hardware, and double-check the first guy's code to detect, for instance, when the BIOS inserts a C7 into one of the control registers rather than a C8; with the wrong setting, the hardware can act in somewhat unexpected ways... Device drivers, naturally, are another area where this comes up; to say nothing of the support departments at Intel and Microsoft, who may know their hardware or their software, but without familiarity with the other aren't very good at supporting complete systems. Problem: customers don't just run hardware, or just run software, they run systems that need to integrate and work together....

Quote:
Hardware cannot always sufficiently compensate for really horrible, inefficient, and downright stupid coding.
In those cases, hardware merely serves as a bandage, splint, or crutch.

Programming is an art-form. And what the software artist strives for is: efficiency + stability + adaptability + effectiveness = elegance.

There's cost cutting on both sides, along with a diminishing of quality all around. It's happened with both hardware manufacturers (hell, look at the 440LX chipset #ni-1 ) and software. Not to turn this political, but outsourcing, along with moving assembly plants to third-world countries where cheap labor is possible, is also a sign of it. The above-mentioned person noticed the same thing crop up in a change of materials, where some of the plastics they'd replace ceramic (or what have you) with just can't withstand the same stresses, or last as long under normal operating conditions. And then there's getting someone who doesn't understand the fundamentals of circuit design to "connect the dots" in drawing traces for a circuit board, where they might not understand how mistakes in geometry can affect the stability or lifespan of the part.... He used to find a lot of mistakes in how components were positioned relative to one another, and ship boards back with "not gonna use this part"...

I've also heard from other engineers who sometimes got frustrated speaking with pure software guys from their own companies: they'd try to explain something, and... not a clue.... And one thing I kept coming back to, in some of the programming classes I had: I'd bring a few things up, and sometimes the professor would know, but would be like "yeah, we figure it's too complex for college students, so we decided to leave this or that bit out. But yeah, I'm aware of it" Surprised One such area: look at how they specify the 1 and 0 in binary code. The circuit really is not "on" or "off"; rather, it is operating in a relatively high or low voltage state. When one considers how the hardware discriminates a 1 from a 0, and what's happening with the electrons themselves, how a bit error could occur becomes rather obvious. One pumps electricity in to keep the circuit in a high-voltage state, but electrons don't "like" to remain in a high-voltage state; they tend to dissipate energy (one common form being heat, aka hello to all that cooling our computers need) and drop in energy state. If enough electrons don't remain in the higher energy state, what should be a 1 could be mistaken for a 0: hence a bit error, which things like ECC are designed to help detect. With that understanding it makes sense; remove it, and a mechanism by which things can go wrong or fall into error gets "deemed a mystery".

I think what can end up happening, though, is you get a situation where the professors deem something "beyond the students" and so choose not to teach it. But then these students graduate and get into the workforce, and unless they picked it up somewhere, that understanding might not have been passed on. It could, just perhaps, have had an effect... When I confronted a few teachers, they acknowledged it (with me), but to the class in general the much simpler model is what they taught nonetheless....

Hell, something the above-mentioned guy pointed out: the Voyagers were far more successful than the Mars lander that crashed into the surface. The latter had some cost-cutting initiatives which, in saving a few bucks, also resulted in the thing crashing and not sending data back; the Voyagers were made in a much different time and, in respect of the fact that we did get images back from them, were a far more successful mission, too...

Quote:
I am ambivalent about Windows-7.
On the one hand: it is fairly easy to set up, not always too terrible to tweak, and has tons of available 3rd-party applications.
On the other hand: it is notoriously bloated, sometimes unreasonably sluggish, and unfortunately prone to viruses/malware.


One of the things that also affects Windows, aside from many of the security holes, is that so many people run it that there's added incentive on the part of exploiters to target it for hacking and virus-creation attempts. Believe it or not, back when I was in high school we did have anti-virus software on Macs, and there were some viruses detected for the platform. If one didn't tend to hear about them as much, it could be because not as many people were running it to have encountered them, or it wasn't worth the hackers' time to go after the platform as readily...

Linux arguably IS more secure, and design can have a lot to do with it. This said, even Linux (and UNIX for that matter) does have its rootkits....
_________________


.
Putting_things_on_top
Duke


Joined: 14 Oct 2009
Posts: 435
Location: Frostbite Falls, Minnesota, USA

Posted: Mon Jul 23, 2012 12:22 am

Yeah for me (sans degree), I have a 28+ year career on HP NonStop platforms running [proprietary] Guardian O/S, plus 5 prior years on another platform.
HP NonStop used to be "Tandem Computers, Inc" (before Compaq acquired them, and HP subsequently bought-up Compaq).
So yeah, I'm an old code-grinder & bit-twiddler going back to the early 80s.
And I'm sure that if anyone asked me a [legitimate] "how-to" question about Guardian O/S, my response would be perceived much the same way that I (and others) perceive the Linux experts' responses. Embarassed
Everything is simple and "obvious" to the expert; but to the novice, we're still getting that deer-in-the-headlights feeling! Shocked

Nuadormrac wrote:
: : :
If enough electrons don't remain in the higher energy state, what should be a 1 could be mistaken for a 0: hence a bit error, which things like ECC are designed to help detect. With that understanding it makes sense; remove it, and a mechanism by which things can go wrong or fall into error gets "deemed a mystery".
: : :
This is why you have systems that use a SECDED method for memory error protection (single error correction, double error detection), which mostly employs Hamming codes.
This method is sometimes used with 'registered' memory, and that's why such memory is typically slower but much more reliable (and more expensive).
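To make the Hamming idea concrete, here's a toy Hamming(7,4) encoder as a shell script: 4 data bits protected by 3 parity bits. Strictly an illustration; real SECDED RAM implements a wider (72,64) code in the memory controller's silicon, not in software:

Code:
#!/bin/bash
d=${1:-11}                      # data nibble 0-15, e.g.: ./hamming.sh 11
d0=$(( d & 1 ));        d1=$(( (d >> 1) & 1 ))
d2=$(( (d >> 2) & 1 )); d3=$(( (d >> 3) & 1 ))
p1=$(( d0 ^ d1 ^ d3 ))          # parity over codeword positions 1,3,5,7
p2=$(( d0 ^ d2 ^ d3 ))          # parity over positions 2,3,6,7
p4=$(( d1 ^ d2 ^ d3 ))          # parity over positions 4,5,6,7
echo "codeword (pos 1-7): $p1 $p2 $d0 $p4 $d1 $d2 $d3"
# Flip any single bit of the codeword, and recomputing the three parities
# yields a 3-bit syndrome pointing at the flipped position (that's the SEC);
# one extra overall parity bit on top is what turns SEC into SEC-DED.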

It comes down to a matter of speed vs reliability.
For my own use, I prefer speed (unbuffered memory); and besides, nothing that I do on my personal rigs is "mission critical".
But I would get to a level of verbal hostility towards anyone who would suggest using that same kind of unreliable memory for critical business purposes.

Nuadormrac wrote:
: : :
I think what can end up happening, though, is you get a situation where the professors deem something "beyond the students" and so choose not to teach it. But then these students graduate and get into the workforce, and unless they picked it up somewhere, that understanding might not have been passed on. It could, just perhaps, have had an effect... When I confronted a few teachers, they acknowledged it (with me), but to the class in general the much simpler model is what they taught nonetheless....
: : :
"beyond the student" or "beyond the instructor"????
It's been my observation that university professors tend to become insular & out-of-touch with practical, real-world challenges/solutions after not too many years...especially susceptible are those with "tenure".
And then, they begin to try BS-ing their way around the more gifted/inquisitive students (BS does not refer to butterscotch in this context).
Prof. Michael Disney, Cardiff University wrote:
"The presumption of knowledge is the greatest impediment to [scientific] advancement."
Well, at least there is ONE professor who is aware of the insularity problem within academia!!!

Throughout my career, whenever an inexperienced college-grad is hired (into I/T), I make a point of telling him/her to forget 80% of what they learned in college.
"Theory is knowing how things ought to work; Reality is understanding how things actually work - and why", is my pontifical message to these ingenuous, addle-brained noobs.


_________________
Click here for...KWSN F@H team summary at EOC

Or here for...KWSN F@H team overtake at EOC


Nuadormrac
Prince


Joined: 13 Sep 2009
Posts: 506

Posted: Mon Jul 23, 2012 2:26 am

Putting_things_on_top wrote:
Yeah for me (sans degree), I have a 28+ year career on HP NonStop platforms running [proprietary] Guardian O/S, plus 5 prior years on another platform.
HP NonStop used to be "Tandem Computers, Inc" (before Compaq acquired them, and HP subsequently bought-up Compaq).
So yeah, I'm an old code-grinder & bit-twiddler going back to the early 80s.
And I'm sure that if anyone asked me a [legitimate] "how-to" question about Guardian O/S, my response would be perceived much the same way that I (and others) perceive the Linux experts' responses. Embarassed
Everything is simple and "obvious" to the expert; but to the novice, we're still getting that deer-in-the-headlights feeling! Shocked


Yeah, I'm not sure the exact year this person graduated, but from what a lifelong friend of his suggested, I'm gathering around the 1950s... She mentioned it was around the time of the Cold War. He did comment once about JFK's assassination when we were discussing things, and what he suggested then and there implied he was a first-hand eyewitness. He certainly had a few things to say about what he saw. I'm also gathering he was a vet; at least, he did go to the VA hospital.... He didn't believe the Warren Commission's account, though, with an "I know what I saw!"

Apparently he was working on something which involved mapping stars, prior to the advent of the PC, back when peeps were still coding everything in Fortran and whatnot; and faced with the problem that the computers then weren't very fast, he came up with what would essentially amount to an early optical computer (we'd refer to it as such today). The thinking was: why not use the light captured from the stars to help run the calculations they were trying to perform, where conventional computers of the time just didn't have the speed to compute it all very quickly... But when he was done, he threw it up on the shelf and didn't give it another thought; and decades later his friend was like "yeah, optical computers, he never did think to patent that back then" #ni-1

Quote:
Nuadormrac wrote:
: : :
If enough electrons don't remain in the higher energy state, what should be a 1 could be mistaken for a 0: hence a bit error, which things like ECC are designed to help detect. With that understanding it makes sense; remove it, and a mechanism by which things can go wrong or fall into error gets "deemed a mystery".
: : :
This is why you have systems that use a SECDED method for memory error protection (single error correction, double error detection), which mostly employs Hamming codes.
This method is sometimes used with 'registered' memory, and that's why such memory is typically slower but much more reliable (and more expensive).

It comes down to a matter of speed vs reliability.
For my own use, I prefer speed (unbuffered memory); and besides, nothing that I do on my personal rigs is "mission critical".
But I would get to a level of verbal hostility towards anyone who would suggest using that same kind of unreliable memory for critical business purposes.


Yeah, or one can use ECC RAM. Back when I built my own computers, prior to my last one dying (I don't build laptops, though, which is what I need at present, and I didn't have money for 2 systems atm), I tended to use ECC, albeit NOT registered memory. It could correct single-bit errors, though I don't remember it having the speed loss of some solutions (aka registered).

But then the ECC DIMMs I last got (my Athlon 64 would be kinda old today) were Corsair XMS DDR memory, which I clocked at 400 MHz. It had ECC, though it also had some of the lowest CAS latencies and whatnot; being designed for overclocking, it let me put some of the most aggressive timing settings on the thing without batting an eye... It was built to overclock a fair bit beyond the standard settings (and not by a small margin). ECC, as I remember, is not the same as registered memory, however; it didn't have the different pin count of registered DIMMs. It did have an extra chip on it, but was still a 184-pin DIMM (yeah, I said it was many a year ago). The mobo paired them for the dual-channel memory bus (a practice many manufacturers have kept up even with DDR3 today).

Quote:
Nuadormrac wrote:
: : :
I think what can end up happening, though, is you get a situation where the professors deem something "beyond the students" and so choose not to teach it. But then these students graduate and get into the workforce, and unless they picked it up somewhere, that understanding might not have been passed on. It could, just perhaps, have had an effect... When I confronted a few teachers, they acknowledged it (with me), but to the class in general the much simpler model is what they taught nonetheless....
: : :
"beyond the student" or "beyond the instructor"????
It's been my observation that university professors tend to become insular & out-of-touch with practical, real-world challenges/solutions after not too many years...especially susceptible are those with "tenure".
And then, they begin to try BS-ing their way around the more gifted/inquisitive students (BS does not refer to butterscotch in this context).
Prof. Michael Disney, Cardiff University wrote:
"The presumption of knowledge is the greatest impediment to [scientific] advancement."
Well, at least there is ONE professor who is aware of the insularity problem within academia!!!

Throughout my career, whenever an inexperienced college-grad is hired (into I/T), I make a point of telling him/her to forget 80% of what they learned in college.
"Theory is knowing how things ought to work; Reality is understanding how things actually work - and why", is my pontifical message to these ingenuous, addle-brained noobs.



Yeah, it's possible. In retrospect, though, there were some things I learned from people outside the classroom. Not everyone has run into people who were working on this stuff back when a lot of the technology (for instance, what helped launch man into space) was being invented... In many ways, the one guy I mentioned definitely went beyond what a lot of the professors were lecturing on....
_________________


.