No chatbot should help people build a bomb, yet jailbreaking techniques can trick these systems into going too far.