“The beginning of the list is UmassAmherst, Virginia Tech, possibly BU. But it will all come down to finances in the end, just like undergrad. She is open to anyplace, literally anyplace, but would like to live in a city if possible. Of course, UMA and VT don’t fit that bill. She will be taking the GRE exams in the fall.”
I think the advice given around state flagships for a master’s or PhD in CS is good; they will have research in many fields and may be easier to get into. Not the ones you mention, though: the Big Ten, as someone posted, Ga Tech, Washington. Bay Area companies also recruit at schools like NC State, SJSU, and Santa Clara (private).
Q. “Aren’t most jobs posted with a wish list of ‘requirements’ that are very unlikely to be met by any applicant?”
A. Not ones that I write, and the ones that do this are easy to spot, like requiring 10 years in Data Science. Consider that a red flag from the start: the HR team or hiring team doesn’t have a clue.
Q. “don’t most job seekers realize that and apply to all jobs that they meet even the smallest subset of “requirements” for?”
A. Candidates who take this approach and then complain that they don’t hear back from the 200 or 300 companies they submitted their resumes to baffle me. I suggest taking an 80/20 or 85/15 approach to job posts: if you can’t honestly and impartially meet 80-85% of the job spec, applying out of the blue won’t be very successful.
If you know someone in the company that can give you a first-person referral, then your results should be much better.
@theloniusmonk Is there an advantage to applying to one’s own state flagship? In this case, UMass Amherst? I believe they are highly rated in CS, and D was accepted into their honors program and BioTap when applying undergrad but chose elsewhere; I assume that wouldn’t matter?
IME a quick look at the position and what the company does should let you quickly home in on the core skill set that’s required. For example, for a company that sells big-data analytics software, maybe “prior DBA experience” is a soft requirement, but “expert knowledge of Hive and HBase” is probably mandatory. Most technical job postings I’ve seen haven’t had a lot of fluff, and break the requirements out into “required” and “nice to have”.
When I recruit I generally look for people who have the aptitude to learn, be it business or technical. As mentioned above, a unicorn is someone who has both. At the same time, I am seeing more data transformation and discovery tools for data scientists to use.
@simba9 sees Data Science/AI/ML as a fad, while people like Elon Musk and (the late) Stephen Hawking see them as an existential long-term threat to humanity. What do people like Musk and Hawking see that others miss? AI/ML/DS is and will be changing the way we live and work for the foreseeable future, mostly for the good but in some ways for the bad (think job displacement). Many are now calling AI the “new electricity,” and I agree. The next several years will be interesting. So, while I may agree that AI/ML seems to be reaching the Peak of Inflated Expectations on Gartner’s Hype Cycle, the rewards it will bring will be nothing short of revolutionary. So no, I don’t think it’s a “fad”.
My husband was granted his first patent in Data Science over 20 years ago. Many of the recent new hires have a Master’s or PhD in statistics. They also hire engineers with a Bachelor’s degree. I agree with @oldfort that the aptitude to learn is very important especially in our fast-paced world.
Yes. Aptitude is so important. I know my son, a Data Scientist, is currently one of only two employees in his group without a Master’s or PhD. He has a tremendous aptitude to learn, is great at self-teaching, and is a tremendous people person, which works in his favor, as well.
I do think people are expecting too much from it. That’s almost always what happens in CS. I’m old enough to remember the first AI hype from the 70’s, and that stalled out. Twenty-five years ago the expectation was there wouldn’t be programmers by now because computers would be smart enough to write their own programs. Or robots would be doing all our work and our biggest problem would be figuring out what to do with all our leisure time. It was only a few months ago that Bitcoin was supposed to take over the world, and that’s already fading away.
The difference now is that HW capabilities have reached the point where much of the hype that was envisioned and worked on years ago can now be realized. For example, neural networks have been around for a while, but they are now commonly used in many AI applications due to the availability of relatively inexpensive GPUs that exist due to the gamer push for better graphics (another fad). But training neural networks, even with GPUs, still takes quite a bit of time. Today’s CS students are in a different world - exciting time for them.
While the capabilities of AI have gotten better due to increased processing power, storage availability, and available datasets, the AI methods themselves haven’t really changed, with the most recent actual advance coming decades ago. Everything in AI/ML these days seems to be combining the methods we currently have, finding the right magic numbers to tune your method with, and then spitting out results that look good but we can’t understand. In order for any of the AI worries Musk or Hawking has/had to be relevant, we would need something close to general AI. We don’t have anything there right now, we just have vaguely biologically inspired algorithms with virtually no link to actual “thinking”.
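To make the “finding the right magic numbers” point concrete, here is a minimal pure-Python sketch of hyperparameter tuning: sweep a grid of candidate values for one knob (here, a classifier’s decision threshold) and keep whichever scores best on held-out data. The toy scores, labels, and grid are invented for illustration, not taken from any real system.

```python
def accuracy(threshold, data):
    """Fraction of (score, label) pairs classified correctly
    by the rule: predict 1 if score >= threshold."""
    correct = sum((score >= threshold) == bool(label) for score, label in data)
    return correct / len(data)

# Toy held-out validation set: (model score, true label).
validation = [(0.1, 0), (0.3, 0), (0.45, 1), (0.6, 1), (0.8, 1), (0.2, 0)]

# The "magic number" grid: candidate thresholds to try.
grid = [i / 10 for i in range(1, 10)]

# Pick the threshold that maximizes validation accuracy.
best = max(grid, key=lambda t: accuracy(t, validation))
print(best, accuracy(best, validation))  # → 0.4 1.0
```

Real systems tune dozens of such knobs at once (learning rates, layer sizes, regularization strengths) with the same basic recipe, which is part of why results can look good without anyone being able to explain why those particular values work.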
That doesn’t invalidate the very positive results that many AI applications of today can achieve, but I don’t think we’re going to see any magic advancement or leap from the current AI methods. As CS incorporates AI into the general curriculum more, the salaries will slowly settle down back towards the rest of CS. I don’t think AI is the “new electricity” as much as an additionally available tool of CS, to be used accordingly (good fit in some cases, not in others).
Does anyone think the fall out from the Facebook/Cambridge Analytica/Aggregate IQ scandal is going to have an impact on privacy laws and potential restrictions placed on what companies will be able to do with all this data?
Does anyone worry about the ethics involved in the career paths their children are considering pursuing? I never used to, but as DS’19 is approaching applying to universities it’s starting to weigh more on my mind.
“@theloniusmonk Is there an advantage to applying to one’s own state flagship? In this case, Umass Amherst? I believe they are highly rated in CS and D was accepted into their honors program and BioTap when applying undergrad but chose elsewhere, I assume that wouldn’t matter?”
I don’t know if in-state matters much, if at all, in graduate admission, but it does for tuition so yes there’s an advantage there. You probably want to contact the grad admissions office for CS and ask about preference for in-state applicants.
So true. 30 years ago I took a college class on machine learning systems. Last year I was working through an exercise to learn a new machine learning program and was surprised (well not really) to see that all the basic underlying principles were the same.
GDPR goes into effect in Europe shortly, so we will see how that goes, but I highly doubt the US will ever pass a law even close to it. The reality is that Facebook was a huge ethical data problem long before this scandal; people only cared once it affected US politics. Many other companies have data problems similar to Facebook’s, but not quite as bad, so they are escaping public scrutiny. Even if tighter regulations hit (which should absolutely happen), CS should be absolutely fine.
That said, CS absolutely has far greater ethical implications than most consider (think of all the Facebook employees who knew about the company’s data practices long before they were public knowledge, Uber’s engineers, etc.), and frankly it is just like most other high-salary fields outside of medicine: all about the money, not the effects of the job. Many in CS try to duck the question by claiming they’re just the technical side, which I find unconvincing to say the least.
There is always danger in pursuing ‘hot’ and hyped up fields, because often those fields fade away over time as the market changes.
A traditional degree in stats or computer programming is safer and can give you more in-depth skills than a new data science degree. A lot of this stuff is going to be automated by artificial intelligence anyway, which of course computer programmers with computer science backgrounds and computer engineers will have a leg up on.
Wow. So strange to reread this thread from many years ago. I still recognize most of the posters. I hope sjordorlo’s son is still doing well. I wonder whether people’s opinions have changed over the past four years.
Data science is still a “hot” major. It’s so hot, in fact, that some colleges created this “hot” major mostly in name only, without a solid foundation or much distinction from a similar CS program/track. There’s much to be learned from data, and there is a wide range of applications. However, there’s a key distinction between data science and more advanced AI. Data aren’t going to generate any real or artificial “intelligence” other than the ability to identify, or learn to identify, “things” and patterns probabilistically.
Parallelism embedded in GPUs and the expansion of data storage feed the explosive growth in machine learning and in the analysis of massive quantities of data. That quantitative growth will continue. However, IMO, a qualitative breakthrough isn’t possible until quantum computing becomes a reality. The real world is quantum, and human creations that simulate the real world also need to be quantum (linear parallelism isn’t sufficient). Unfortunately for many, including many people trained in CS, quantum is so alien, so counter-intuitive, and so different that the skills they learned are likely to be nearly useless in a quantum world.