For me they weren’t allowed in Calc I, II, III, Alg I, II and Differential equations. Every other class pretty much required it.
If there’s one thing language models suck at, it’s doing basic math.
If you’re using a GPT-3.5-turbo-level model, sure. Synthetic data is perfect for teaching LLMs; o1 will be good enough up to Calc III IMO, maybe even better.
The only thing I don’t like about this is that it uses a TI, yikes.
LLMs do suck at math. If you look into it, the o1 models actually escape the LLM output and write a Python function to calculate the result. I’ve been able to break their math functions by asking for computations that use math not in Python’s standard library.
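A rough sketch of the failure mode described above (this is a hypothetical illustration of the tool-use pattern, not OpenAI's actual sandbox code): arithmetic that only needs Python's standard `math` module runs fine, but a special function like Bessel J0 isn't in the standard library, so code the model writes for it dies unless a third-party package like SciPy happens to be installed.

```python
import math

def compound_interest(principal: float, rate: float, years: int) -> float:
    # Plain arithmetic: the standard library handles this with no trouble,
    # so model-generated code like this "just works" in a bare sandbox.
    return principal * (1 + rate) ** years

def bessel_j0(x: float) -> float:
    # The math module has gamma() and erf(), but no Bessel functions.
    # Model-generated code has to reach for SciPy here, and in a sandbox
    # without it installed this raises ImportError at call time.
    from scipy.special import j0  # NOT in the standard library
    return j0(x)

# Stdlib-only math computes fine:
print(compound_interest(1000, 0.05, 10))
```

Calling `bessel_j0(1.0)` in an environment without SciPy is exactly the kind of request that breaks the approach.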
I know someone also wrote a Wolfram integration to help LLMs solve math problems.
They let us use them for all my college math classes.
They really don’t help much at all if you don’t understand the math, and if you do understand, you don’t need the calculator most of the time.