• 0 Posts
  • 74 Comments
Joined 2 years ago
Cake day: June 17th, 2023



  • It’s not just random jitter; it also likely adds context, including the device you’re using, your other recent queries, and your approximate location (like what state you’re in).

    I don’t work for Google, but I am somewhat close to a major AI product, and it’s pretty much the industry standard to give the model some contextual info in addition to your query. It’s also generally not “one model” but a set of models run in sequence, with the LLM (think ChatGPT) only employed at the end to generate a paragraph from a conclusion and evidence found by the previous models.
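    To make the “set of models run in sequence” idea concrete, here’s a rough sketch of that kind of pipeline. Every function and field name here is invented for illustration; no real product works exactly like this, but the shape (enrich the query with context, retrieve and rank evidence, condense it, and only then hand it to a text generator) matches what I described above.

```python
# Hypothetical sketch of a staged answer pipeline. Context is attached to the
# raw query, earlier stages stand in for retrieval/ranking models, and only
# the last stage plays the LLM's role of writing prose. All names invented.

def enrich_query(query, device, region, recent_queries):
    """Bundle the raw query with contextual signals a backend might add."""
    return {
        "query": query,
        "device": device,
        "region": region,
        "recent": recent_queries[-3:],  # only the most recent history
    }

def retrieve_evidence(ctx):
    """Stand-in for a retrieval/ranking stage: pick evidence for the query."""
    # A real system would hit a search index; here we fabricate one result.
    return [f"doc about {ctx['query']} (ranked for {ctx['region']})"]

def draw_conclusion(evidence):
    """Stand-in for a stage that condenses evidence into a conclusion."""
    return f"best answer based on {len(evidence)} source(s)"

def generate_paragraph(conclusion, evidence):
    """Only this final stage acts like the LLM: it turns the structured
    output of earlier stages into a readable paragraph."""
    return f"{conclusion}. Sources: {'; '.join(evidence)}"

def answer(query, device, region, recent_queries):
    ctx = enrich_query(query, device, region, recent_queries)
    evidence = retrieve_evidence(ctx)
    conclusion = draw_conclusion(evidence)
    return generate_paragraph(conclusion, evidence)

print(answer("best pizza", "phone", "Ohio", ["pizza near me"]))
```

    The point is that the expensive text model never sees your raw query alone; it gets handed a pre-digested conclusion plus evidence, which is why the final paragraph can reflect context you never typed.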


  • Relaying a key signal 20 ft when you know the key is there, like when you’re at home, isn’t too tricky. But I’d argue that relaying a signal across hundreds of feet, through a busy mall or store, when you’re not even sure the owner is there, is quite another thing. You could also require that the IR blaster is in the car before starting. There’s also a technology Google has been using for a while now where one device (the car) emits a constant ultrasonic signal for the other device (the key) to pick up, to determine whether they’re close to each other. That works through clothing, but isn’t easily relayed.
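    One simplified way to see why an ultrasonic check resists relaying: sound is slow (~343 m/s), so the round-trip time directly encodes distance, and a relay hop can only add delay, never remove it. This is just a toy model with made-up numbers (the 3 m threshold and the 100 m relay distance are assumptions, not from any real key fob), and a real attack would retransmit over radio, but capture and re-emission still add latency, which I lump into the extra delay here.

```python
# Toy model of an ultrasonic proximity check. The threshold and distances
# are illustrative assumptions, not values from any real car or key fob.

SPEED_OF_SOUND_M_S = 343.0
MAX_UNLOCK_DISTANCE_M = 3.0  # assumed policy: key must be within 3 m

def estimated_distance_m(round_trip_s):
    """Convert a measured ultrasonic round-trip time into a distance."""
    return (round_trip_s * SPEED_OF_SOUND_M_S) / 2

def key_in_range(round_trip_s):
    """Unlock only if the time-of-flight implies the key is nearby."""
    return estimated_distance_m(round_trip_s) <= MAX_UNLOCK_DISTANCE_M

# Key genuinely 2 m away: round trip covers 2 * 2 m of air.
honest_rtt = 2 * 2.0 / SPEED_OF_SOUND_M_S
# Relay across a 100 m mall: the extra transit delay can't be faked away.
relayed_rtt = honest_rtt + 2 * 100.0 / SPEED_OF_SOUND_M_S

print(key_in_range(honest_rtt), key_in_range(relayed_rtt))  # True False
```

    The honest round trip implies 2 m and passes; the relayed one implies over 100 m and fails, which is the core idea behind distance-bounding proximity checks.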




  • Autopilot on an aircraft maintains altitude and heading between waypoints, and in some (ideal) situations can automatically land the aircraft. In terms of piloting, it can handle the middle of the journey entirely autonomously, and sometimes even the end (the landing).

    Autopilot (the Tesla feature) is not rated to drive the car autonomously, requires constant human supervision, and can disengage automatically at any time. Despite being sold as an “autonomous driver”, it cannot function as one the way autopilot on a plane can. The name clearly borrows from the aircraft feature to imply that the car can pilot itself through at least the middle of the journey without direct supervision (which it can’t). That is misrepresentation.







  • I program professionally, and I copy-paste all the time. The difference is that when I copy-paste, it’s 10-20 lines of code, not a line or two, and I’m not fishing for a solution to the problem. I already have the optimal solution in my head and am just searching for the code I already know. It’s just faster than typing it by hand 🤷🏻




  • Sure, but a TV ad takes (at the least) an editor, or (at the most) a cast and crew. It takes both money and time to create, and loops average working people into the process. Of course there will be people in any profession who will make whatever they’re paid to, but by and large, most of the acting/editing industry has some form of ethics.

    People debunking false claims takes time too, but since creating them takes time as well, things have a chance to balance out (obviously that’s happening less, but there’s still a chance for it to happen). But if an AI model can pump out fake history autonomously, almost instantly, and without any chance for a human with ethics to intervene in the process, debunking and fighting misinformation becomes WAY harder. You’re no longer fighting a person with limited time and resources; you’re fighting a firehose of false content that will bury you without even breaking a sweat.