You could download Comma.ai’s new open-source Python code from GitHub, grab the necessary hardware, and follow the company’s instructions to add semi-autonomous capabilities to specific Acura and Honda models (with more vehicles to follow). Comma.ai CEO George Hotz told IEEE Spectrum last week that Comma.ai’s code has safety features, but what would happen if there’s a bug and your car crashes into a building, another car, or a pedestrian? Self-driving cars are notoriously difficult to test for safety.

Hotz writes in an email, “It’s not my code, I did not release it”—Comma.ai Inc. “released and maintains it.” Most legal experts who spoke with IEEE Spectrum—and Hotz himself—believe that if you use the company’s code and something goes wrong, the company isn’t liable for damages. You are.

But Consumer Watchdog advocate John Simpson doesn’t believe this is fair. He says Hotz “was somewhat responsible” for any damage that could occur. Although responsibility gets “murkier” as more developers modify the code, he says Hotz made it public, and should therefore share liability with the user.

The controversy exists in part because autonomous driving legislation is just starting to take shape around the world.