Choosing from CS electives

<p>Hello! </p>

<p>This is my first post, so I'll introduce myself. I'm an undergraduate student attending Gallaudet University. Recently changed my major to Computer Science. I'm thinking about becoming a software engineer / database designer. I'm a former Rochester Institute of Technology student and was an IT major, but didn't get too far before leaving school. I wasn't ready for college back then. But one thing I know for sure: I was likely born for computers! Even in my computer programming class where some students were bored to death, I was having the time of my life.</p>

<p>Now, I would like help with choosing from specific CS electives at Gallaudet. The list of courses required for a BS in Computer Science at that university is shown here: <a href="http://mathcs.gallaudet.edu/computerscience.xml">Math and Computer Science - B.S. in Computer Science - Gallaudet University</a></p>

<p>The electives shown on that page are:</p>

<p>Choose two courses:
CSC 202 Assembly Language Programming (3)
CSC 305 Introduction to File Processing (3)
CSC 352 Computer Graphics (3)
CSC 401 Analysis of Algorithms (3)
CSC 402 Artificial Intelligence (3)
CSC 409 Parallel Processing (3)
CSC 495 Special Topics (3)</p>

<p>Keeping in mind only two courses are needed, my questions are:
- Which electives are in high demand/low supply? Do any of them teach skillsets that most employers are looking for that are in low supply?
- CSC 202 and CSC 305 are recommended prerequisites for some required courses. But would any of the others be better alternatives?</p>

<p>I'm providing the course descriptions so you can help me out with this. Your help is greatly appreciated.</p>

<p>COURSE DESCRIPTIONS</p>

<p>CSC 202 Assembly Language Programming (3)
This course will provide basic concepts of programming
systems, introduce computer architecture, and introduce an
assembly language.</p>

<p>CSC 305 Introduction to File Processing (3)
This course will introduce concepts and techniques of
structuring data on bulk storage devices, provide experience
in the use of bulk storage devices, and provide the foundation
for applications of data structures and file processing
techniques.</p>

<p>CSC 352 Computer Graphics (3)
Windowing environments and graphical user interfaces will
be discussed. Experience will be provided with programming
graphical interfaces. Transformations including windowing,
clipping, geometrics, and perspectives. Computer
graphics applications.</p>

<p>CSC 401 Analysis of Algorithms (3)
Fundamental data structures and algorithms are reviewed:
arrays, pointers, trees, and recursion. Sorting techniques
such as quicksort, radix sort, heapsort, and mergesort are
analyzed in relation to their computational complexity
and memory requirements. Searching methods, including
binary, balanced trees, hashing, radix, and external are analyzed
for computational complexity and memory requirements.
String processing, pattern matching, cryptology,
simple closed path, convex hull, depth-first and breadth-first
searches, connectivity, polynomial, Gaussian, and curve fitting
algorithms will be applied to basic data sets.</p>

<p>CSC 402 Artificial Intelligence (3)
Artificial intelligence studies ways of making computers do
intelligent tasks. These tasks include playing games, expertly
solving problems, understanding natural language, and
proving theorems. The theoretical background of artificial
intelligence, artificial intelligence programming paradigms,
and some applications of artificial intelligence are introduced.</p>

<p>CSC 409 Parallel Processing (3)
Parallel processing systems and supercomputers. A combination
of theory and practice using supercomputers and
parallel processors available on the Internet. Emphasis on
parallel algorithms, parallel language constructs, message
passing libraries, and high-level tools for creating parallel
programs from serial programs.</p>

<p>CSC 495 Special Topics (1-3)
Advanced topics in computer science depending on the
needs and interests of the student.</p>

<p>If you want to do database design, it would be nice to take a database course as an elective. If the CS department doesn’t have database courses, check the College of Business/Management Info Systems department for one.</p>

<p>After taking an academic database course at your school, find some way to take a training course in either SQL Server or Oracle.</p>

<p>You STILL can take electives from that list. I would DEFINITELY take Analysis of Algorithms. The course trains your mind for problem-solving. I am not saying that you will do something at work exactly like the course, but it helps give you a “mindset” for attacking programming and coding.</p>

<p>The 2nd course can come from any of the others you listed.</p>

<p>Oh, and one more thing about the Analysis of Algorithms course. If you ever attend grad school for computer science, that course will be required for admission and/or appear on the comprehensive exam for the M.S. degree.</p>

<p>202 is kind of a specialty these days, but you might enjoy it. Pretty low level.
Ditto 402 but at the high end. 409 even more so. Would be nice if they covered distributed computing. Cloud computing too.
305 seems very fundamental and probably easy. I’m surprised it is a course. So maybe it should be considered essential there.
401 does seem essential, but I noticed you also have the required course 315, which I suggest taking first.
Impossible to tell what 495 is.
352 is another niche area, but again you might enjoy it. It probably has more application than 202, 402, or 409.
Yes, 407 will be an important course.</p>

<p>Overall, my opinion is that the program is mostly math oriented. Yet the foreign language recommendation leans it more toward a BA than a real BS program.</p>

<p>Most telling is what’s not offered: subjects like compiler/interpreter design, digital logic, network security, organizational IT, man-machine/human-computer interfaces, technical writing, and information theory are staples of many CS programs. And operating systems and computer architecture, for example, ought to be two classes, not just lumped into one.</p>

<p>You must emphatically take the algorithms course. For the rest, do whatever sounds cool or others recommend.</p>

<p>My son says to take algorithms and AI, but AI is only for fun.</p>

<p>You should see if you can find a Database I course, which should cover some relational theory and a fairly thorough tour of SQL (the database query language). A second-semester course could cover more advanced topics. Graduate database courses focus more on aspects of engineering database management systems. Understanding how a database system is architected and built can add to your abilities in designing database solutions.</p>
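
<p>To give a feel for what that first course covers, here is a minimal sketch using SQLite’s C API (the tables and query are made up for illustration; you’d need the sqlite3 development library installed to build it):</p>

<pre><code>/* Build with: gcc db_demo.c -lsqlite3 */
#include <stdio.h>
#include <sqlite3.h>

/* Callback that sqlite3_exec() invokes once per result row. */
static int print_row(void *unused, int ncols, char **vals, char **names)
{
    (void)unused;
    for (int i = 0; i < ncols; i++)
        printf("%s=%s  ", names[i], vals[i] ? vals[i] : "NULL");
    printf("\n");
    return 0;
}

int main(void)
{
    sqlite3 *db;
    char *err = NULL;

    /* An in-memory database: nothing is written to disk. */
    if (sqlite3_open(":memory:", &db) != SQLITE_OK)
        return 1;

    /* The bread and butter of a Database I course:
     * schema definition, inserts, and a join query. */
    const char *sql =
        "CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT);"
        "CREATE TABLE enrollment (student_id INTEGER REFERENCES student(id),"
        "                         course TEXT);"
        "INSERT INTO student VALUES (1, 'Alice');"
        "INSERT INTO student VALUES (2, 'Bob');"
        "INSERT INTO enrollment VALUES (1, 'CSC 401');"
        "INSERT INTO enrollment VALUES (2, 'CSC 407');"
        "SELECT s.name, e.course FROM student s "
        "JOIN enrollment e ON e.student_id = s.id;";

    if (sqlite3_exec(db, sql, print_row, NULL, &err) != SQLITE_OK) {
        fprintf(stderr, "SQL error: %s\n", err);
        sqlite3_free(err);
    }

    sqlite3_close(db);
    return 0;
}
</code></pre>

<p>The SQL itself is the portable part - essentially the same CREATE/INSERT/SELECT statements would run on MySQL, SQL Server, or Oracle with little or no change.</p>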

<p>You could also just read through Readings in Database Systems by Stonebraker and Hellerstein along with the papers recommended near the start of the book.</p>

<p>I think I’ve had 20 people tell me a class in assembly is the most awful class I could ever take.</p>

<p>I second (third? fourth?) algorithms.
Think about Parallel computing too – it’s a niche job-wise, but quite in-demand, and growing quickly. Databases is one of the most practical courses if that’s ever an option.</p>

<p>Don’t take assembly unless you plan to work for a microchip manufacturer or on operating systems (you probably won’t) or have a very deep passion for how a computer chip processes things.</p>

<p>GLOBALTRAVELER, great input! CSC 407 Database Design is required for Computer Science majors at Gallaudet: <a href="http://mathcs.gallaudet.edu/computerscience.xml">Math and Computer Science - B.S. in Computer Science - Gallaudet University</a>. My school has no Business major (other than Business Administration), but the Computer Information Systems (CIS) major offers two database courses: CIS 317 - Database Design and Implementation (which teaches use of relational DBMSs using SQL) and CIS 405 - Advanced Database Concepts and Applications.</p>

<p>I’ve sent an email to the CSC department chair and another Computer Science professor to see whether both CIS courses teach material not covered in CSC 407. I’m checking on how much SQL is taught too. Does anyone know the best way to find a training course in SQL Server or Oracle if my school doesn’t offer one? Should I apply to another school to take a training course and become certified? Or can I teach myself online or from a book, then take an examination to get a certification?</p>

<p>mrego, great observations! I will check with the Computer Science department on these things. My school isn’t a technical institution, so I’m considering consortium schools. It really helps to know what’s missing. Will the absence of these courses hurt my chances of being hired in general? Are the BS requirements sufficient (as a starting block) to meet what employers are looking for?</p>

<p>BCEagle, I’ll check with the Computer Science department about whether the CIS 317 and CIS 405 courses are like that. And check into that book. As a supplement, do you or anyone else know of any (free if possible) online tutorial programs for learning that kind of material?</p>

<p>QwertyKey, when they used the word “awful” did they mean a class in assembly is boring, insignificant, or very hard?</p>

<p>I’m definitely taking Analysis of Algorithms then. Regarding Artificial Intelligence, I have read elsewhere that there’s not enough demand for that. As for Parallel Processing, what about dual- and quad-core processors? Perhaps understanding that stuff will be invaluable for database construction on future computers? Sorry, too many questions this time.</p>

<p>Many universities offer certificate programs in SQL Server or
Oracle. These are usually taught by some arm of the school that does
part-time classes, and these might not be covered by your undergraduate
tuition. You can teach yourself from a book but it takes some effort.</p>

<p>I believe that you can download the database engines from their respective
vendors and play around with them - as long as you don’t use them for
commercial purposes. You can also download and install MySQL which is
an open source database that’s used commercially (a lot - most online
forums are run on MySQL). MySQL is currently owned by Oracle - recently
acquired when they bought Sun Microsystems.</p>

<p>As far as books go, you might ask the professors what they use to teach
the courses at your school. I had a look at the text that they used for
my son’s course and I don’t know that I would recommend it. I took the
course on Date’s classic but it is somewhat outdated. I’ve heard good
things about the text by Raghu Ramakrishnan. One of my coworkers was
his roommate at UT Austin and says the guy is very good. You could
check the Amazon reviews for textbooks.</p>

<p>Many schools, especially those with an engineering school, teach several
hardware-related courses such as logic design, computer organization
and computer architecture. In some courses, architecture includes
assembly programming. I think that you can become a better programmer
with knowledge of the hardware and how it works. I think that hardware
exposure is a must for software engineers. Assembler is directly
useful for reading crash dumps, developing high-performance algorithm
implementations, and debugging. If you’re a pure software guy, the
hardware will seem like magic - it’s nice to have the understanding
of that magic.</p>

<p>Assembly is usually hard or tedious. Machine instructions are typically
very simple. When you code in a higher-level language such as C++, the
compiler generates long sequences of assembler code for you and builds
an object file of these instructions which then gets linked to produce
an executable image. In an assembler class, you have to write these
low-level instructions yourself. Memory allocation, input/output and
a lot of other programming constructs are more rudimentary. You have
to work with registers, stacks, etc. which are part of the programming
paradigm - only they are largely hidden and taken care of for you by
compilers and linkers. It’s another case of understanding what is under
the hood.</p>
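
<p>To make the gap concrete, here is a sketch of a one-line C function and roughly the x86 assembly a compiler might generate for it (exact output varies by compiler, flags, and calling convention):</p>

<pre><code>/* A trivial C function: the compiler handles the stack and registers. */
int add(int a, int b)
{
    return a + b;
}

/*
 * Roughly what a compiler might emit for 32-bit x86 (cdecl),
 * shown only for illustration:
 *
 *   add:
 *       movl 4(%esp), %eax    ; load argument a from the stack
 *       addl 8(%esp), %eax    ; add argument b; result lands in eax
 *       ret                   ; return value is passed back in eax
 *
 * In an assembly class you write the register and stack manipulation
 * yourself; in C, the compiler and linker take care of it for you.
 */
</code></pre>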

<p>Artificial Intelligence was going to take over the world in the 1980s
and 1990s but it never lived up to the hype. The term took on a
slightly negative connotation. AI concepts are used in a lot of
products today, though companies might not use the AI label; the
concepts often go by other names now.</p>

<p>The most obvious use of assembly is for people who write compilers. Assembly gets you one step closer to the hardware than C. You certainly can’t have a computer architecture class without discussing the ISA. Nonetheless, assembly programs, viewed by themselves, can be a far leap from what the hardware is actually doing.</p>

<p>^ LOL, yeah. People still in AI insist on calling it “machine learning” now because AI has such a bad stink on it after years of not living up to unrealistic expectations. It’s a fascinating field but it’s one that you’re almost ashamed to have your name associated with.</p>

<p>There are some people in academia who think that budding software developers will spend the majority of their careers parallelizing legacy code and developing new code for parallel systems. We’re really at quite an interesting junction in the short history of CS… Moore’s law has already flattened out and it’s clear that companies are investing in multi-core systems, as you have already pointed out. Coupled with the resurgence of vector architectures (GPUs), there is a lot of potential benefit in studying non-von Neumann computing.</p>

<p>In the old days, you had lots of computer vendors with their own
hardware architectures, operating systems and compilers. It made for a
lot of software engineering jobs. On x86, there are now three main
compilers: Microsoft, Gnu and Intel. There are, of course, smaller
players like Pathscale too. Back in the 1990s I worked at one of these
hardware companies and we moved to using a single compiler backend to
reduce engineering costs. The individual language compilers generated
tuples (intermediate code) that were fed to backends to generate actual
code. There were multiple backend code generators, so compilers for
different languages could target multiple backends for code generation,
creating m x n compiler solutions. I believe that the backend stuff
eventually got sold off to Intel.</p>

<p>You can certainly get a good view of the hardware (or at least the
process in the hardware environment) with a debugger session that
displays memory, the stack, registers, code, etc. You can also get
a good view if you have to do a programming assignment in machine
code (a good architecture course should require this).</p>

<p>Parallelizing code well is quite difficult for applications that don’t
lend themselves to easy parallelization. Server-type operations (database,
web page, etc.) where processes are independent and units of work are
small are fairly easy to get bang for the buck on multiple-CPU systems.
Applications where there are a lot of dependencies or where subsystems
aren’t easy to partition present more of a problem. Some applications
may only lend themselves to be partitioned in a limited number of ways.</p>

<p>The compiler guys are working on auto-parallelization but it is a
difficult problem. Better language semantics are needed to provide
information to the compiler on when code can be parallelized.</p>
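
<p>As a sketch of the kind of parallel language construct mentioned above, here is a loop annotated with OpenMP, where a single pragma gives the compiler exactly that information - that the iterations are independent and can be split across cores (assuming a compiler with OpenMP support, e.g. gcc with -fopenmp):</p>

<pre><code>/* Build with: gcc -fopenmp sum.c */
#include <stdio.h>
#include <omp.h>

#define N 1000000

static double a[N];

int main(void)
{
    double sum = 0.0;

    for (int i = 0; i < N; i++)
        a[i] = i * 0.5;

    /* The pragma asserts that the iterations are independent, so the
     * runtime may divide them among threads; the reduction clause
     * handles the shared accumulator safely. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += a[i];

    printf("sum = %f (up to %d threads)\n", sum, omp_get_max_threads());
    return 0;
}
</code></pre>

<p>Without the annotation, the compiler would have to prove the independence itself, which is exactly the hard part of auto-parallelization.</p>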

<p>x86 chips have vector engines in them but they are pretty weak. Intel
will have 16 256-bit vectors in x86 chips in a year or two so vector
work will be possible, even without help from the GPU.</p>

<p>The current set of major compilers already offer autovectorization
but large performance gains are mainly had through hand-written
vector code or the use of performance libraries.</p>
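
<p>For a taste of what hand-written vector code looks like, here is a small sketch using SSE intrinsics, which operate on four floats per instruction (this is the sort of code an autovectorizer tries to generate for you; the function itself is hypothetical):</p>

<pre><code>#include <xmmintrin.h>  /* SSE intrinsics, supported by the major x86 compilers */

/* Add two float arrays. For brevity this sketch assumes n is a
 * multiple of 4; real code would handle the remainder elements. */
void vec_add(float *dst, const float *a, const float *b, int n)
{
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(&a[i]);            /* load 4 floats of a */
        __m128 vb = _mm_loadu_ps(&b[i]);            /* load 4 floats of b */
        _mm_storeu_ps(&dst[i], _mm_add_ps(va, vb)); /* 4 additions at once */
    }
}
</code></pre>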

<p>Back to the original post: you can perhaps see that there are several
areas of interest in computer science and that getting deeper into
an area often requires understanding of certain fundamental subjects
like hardware.</p>

<p>1) Your Intel and AMD processors execute machine instructions out of order.
2) IA-64 assembly instructions are converted at run-time into uops that the hardware actually runs. Therefore, the published assembly language is not the true ISA of the processor.
3) Being able to write a program with loads and stores is probably far less useful than understanding the memory architecture of the system.
4) A class in assembly could be fun, but won’t help you in a job interview.</p>

<p>1) yes.
2) The EPIC approach has been an interesting 20 year experiment, and yes, it’s like flying a plane - too complex for humans to do. It seems the x86 and IBM are winning the war at the high end.
3) Writing code using instructions that compilers don’t support can be useful too.
4) It depends on the job that you are interviewing for.</p>

<p>^ I am of the opinion that every CS major <em>should</em> know a little bit about assembly language and computer organization/architecture… but I am also of the opinion that one needn’t know <em>so</em> much about it, unless it is of particular interest. Personally, I do find it interesting and have taken some coursework in it, but there are certainly other things one could study instead. Parallel computing, AI, Graphics… these are all good things to study and probably have better profit margins for a CS student than more than basic organization/architecture.</p>

<p>Actually, I’ll recommend you talk to some professors about “special topics” courses. These might be courses that a whole class of students can take, courses dealing with very modern or interesting areas of CS. It could be a design project or perhaps even an independent study / undergraduate research course. As somebody who invested heavily in stuff like that, I can say that it pays off in the end if you can find something you want to do. Indeed, if graduate school is on the horizon, this is arguably the most important kind of class you can take (well, besides algorithms ;D).</p>

<p>The EPIC approach used in the Itanium failed miserably and lost out to dynamic hardware scheduling.</p>

<p>I think that there are so many interesting areas that doing a decent job in CS requires a masters degree or an extra year of courses (possible with a lot of AP or dual-enrollment courses).</p>

<p>There were many reasons for the demise of EPIC, and I haven’t heard of dynamic hardware scheduling being one of them. I have an EE friend who writes extensively on Itanium and attends the relevant industry conferences, and he wrote up a small summary of why it hasn’t done as well as it could have against IBM. Mainframe-class architectures (Alpha, SPARC, Itanium, etc.) differ from commodity processors in how they deal with various problems that chips have. Nehalem-EX apparently is the first x86 processor from Intel that deals with those problems in a way similar to the mainframe architectures.</p>

<p>^ And be careful about hijacking threads, BCEagle91 and Timmy2.</p>

<p>The Computer Science professor replied and told me that I needed to contact another professor who teaches database courses, so I’ve already emailed that professor. I’m waiting on her reply now.</p>

<p>I’m not sure that my university offers a certificate program in SQL Server or Oracle. That’s one of the questions I’m awaiting a reply on from the other professor. Also, I looked online and found some free SQL Tutorials. Based on individual experience, what’s the best online SQL or Oracle tutorial website? Is knowing only MySQL sufficient or is it important to also know PostgreSQL, SQLite, MSSQL, Sybase, etc?</p>

<p>I’ll look at Amazon reviews of different books. Anyone else know a good starting book to learn about databases, SQL, or Oracle?</p>

<p>I agree that knowing how hardware and software interact helps make more sense out of things. BCEagle91 points out that “Assembly gets you one step closer to the hardware than C,” even though assembly programs may be a far leap from what the hardware is doing. Is debugging a crucial skill, especially for interpreting errors between hardware and software? What courses teach about debugging sessions? Is this important for database careers?</p>

<p>Sounds good about special topics! I’ll check in with the Computer Science department on this. There’s a lot to learn. :)</p>