I'm checking if Udemy or Skillshare have any courses on farming. Time to pack bags and go farming.
> And so it starts

And passively as well.
IBM to Replace 7800 Jobs with AI, Announces Hiring Freeze: Report
IBM CEO Arvind Krishna has stated that the company is looking into replacing around 30 percent of the workforce with AI. Along with this, the CEO has shared that IBM would be pausing its hiring for a while amid mass layoffs and advancing technology. — Times Now (timesnownews.com)
This has got to be the most shady thing this company has pulled.
> This has got to be the most shady thing this company has pulled.

We are back in the rat race to secure our positions, and they will always pull stunts like this. Now AI is here to stay and progress!
A company needs 100 freshers.
A company hires 150 quality freshers at a 5-lakh package. Annual expense: 7.5 crore. Those quality freshers stop applying elsewhere.
A company slashes the package by 50% and says the move actually benefits the new hires on an 'aggregate' basis.
A company gets rejected by 50 freshers.
A company is now left with 100 quality freshers at a 2.5-lakh package. Annual expense: 2.5 crore.
A company profits 5 crore and the CFO takes home 1 crore in incentive.
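The arithmetic in the steps above can be sketched in a few lines of Python. All figures come from the post itself (1 lakh = 10^5 rupees, 1 crore = 10^7 rupees); the variable names are just illustrative.

```python
# Hiring arithmetic from the post, in rupees.
LAKH, CRORE = 10**5, 10**7

offered, package = 150, 5 * LAKH          # 150 offers at 5 lakh each
planned_cost = offered * package          # 7.5 crore annual expense

slashed = package // 2                    # package cut 50% to 2.5 lakh
rejections = 50                           # 50 freshers walk away
joined = offered - rejections             # 100 freshers remain
actual_cost = joined * slashed            # 2.5 crore annual expense

savings = planned_cost - actual_cost      # 5 crore "profit"
print(planned_cost / CRORE, actual_cost / CRORE, savings / CRORE)
```

Running it confirms the post's numbers: 7.5 crore planned, 2.5 crore actual, 5 crore saved.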
> The field is changing so rapidly that it has become hard for me to keep up.

That's OK, but there is supposed to be a hive mind here and others can answer. They do not, because they cannot.
> U.S. military officials have expressed concerns in the past that AI and machine learning could lead to situations where there is just too much software code and other data to be truly confident that there is no chance of this sort of thing happening.

^THIS
> Machine learning is just making it do what the majority do. Example: Google keyboard. In the beginning, it was really good at suggesting possible words that one would type, and really good at spelling. You could learn something from it if your English was not up to par. Now? I doubt it's anything like its former self. All the words it suggests have spelling mistakes, grammar is inconsistent, suggestions are different from what you want; it even suggests a different language in romanised form, like Hinglish. But you can't blame the language model behind the keyboard for this. If the majority of people using it type in Hinglish, it will think you want to type in Hinglish. It doesn't know any better about culling words unless you delete the suggestions yourself.

Yeah, this has been my experience too, and a frustrating one at that. Interesting explanation.
> By using AI we will only speed up this process of getting everyone to the same median IQ level in all fields, and that level will be near the bottom. If there are millions of idiots saying the wrong thing and just a few hundred people saying the right thing, guess what will become the norm.

Does not look promising, does it? If the lowest common denominator ends up becoming the reference, then where is the learning?
> Machine learning is just making it do what the majority do. Example: Google keyboard. In the beginning, it was really good at suggesting possible words that one would type, and really good at spelling. You could learn something from it if your English was not up to par. Now? I doubt it's anything like its former self. All the words it suggests have spelling mistakes, grammar is inconsistent, suggestions are different from what you want; it even suggests a different language in romanised form, like Hinglish.

You can blame Google for not keeping the app up to date. Google Assistant is another such example: it used to be good, but now it's not. TBH, many Google apps have been turning bad because Google has stopped innovating.
> A drone is told it gets points for kills, but sometimes the operator refuses to give it the go-ahead. It concludes what? Kill the operator, because he is interfering with the mission. Then they said, OK, that is bad, don't do it. It then tries to take out the communication tower the operator uses to control it. WTF!!

You are again comparing an AI like GPT-4 with the AI of some drone.
> You are again comparing an AI like GPT-4 with the AI of some drone.

Well, if we are talking about autonomous systems, then why is it different? Earlier you said I was not making the right comparison when I compared it with earlier-gen expert systems.
Both use the same terminology, yes, but these things are quite different. They have nothing in common.
> Our education system doesn't produce many experts.

Which education system does this? Becoming an expert is experience-based. It's up to you. If you do anything for 5-10 years, you become one.
> Does not look promising, does it? If the lowest common denominator ends up becoming the reference, then where is the learning?

To learn something, it (the AI model) needs to think on its own. Grammarly probably has stricter standards and rules for what its language model learns, hence why they are doing a better job.
> Well, if we are talking about autonomous systems, then why is it different? Earlier you said I was not making the right comparison when I compared it with earlier-gen expert systems.

Automated systems going haywire isn't anything new. As you guys have mentioned, Gboard is one such example.
This time I gave you an example of a system acting on its own. Isn't that what we're talking about here?
What you would think is a simple task turns out not to be.
> I remember asking a friend if he wanted to do an MBA, and he said why; the only purpose of an MBA is to build a network. You don't actually learn to run or manage a business.

I'm that guy.
> Traditional school

What else is left in schools anyway? Periodic table gone, Darwin gone, Mahatma Gandhi gone, climate and ecosystems gone.
Step into any Silicon Valley coffee shop and you can hear the same debate unfold: one person says that the new code is just code and that people are in charge, but another argues that anyone with this opinion just doesn’t get how profound the new tech is. The arguments aren’t entirely rational: when I ask my most fearful scientist friends to spell out how an A.I. apocalypse might happen, they often seize up from the paralysis that overtakes someone trying to conceive of infinity. They say things like “Accelerating progress will fly past us and we will not be able to conceive of what is happening.”
I don’t agree with this way of talking.
> Just like accountants started using calculators instead of calculators replacing them, no?

Still agree with this. The calculator replaced the slide rule, not the user.
It is a tool that will be utilized in the workflow.
(imo)
> AI should be thought of as a tool and not some creature.

Exactly…
> He thinks of ChatGPT as a mashup. Like music can be sampled from different sources and new music made. But AI cannot come up with original stuff on its own. All it can do is blend.

There's no such thing as original-original.