This is not a big issue and probably doesn't need a fix, but I thought I would point it out. The generated code coerces the Float to a Double because the hardcoded values are treated as Doubles; I feel it should be flipped, with the hardcoded values inferred as Floats instead. When I specify the types explicitly and store the constants in variables, the generated Swift behaves:
```swift
public static func celsiusToFahrenheit(_ value: Float) -> Float {
    let multiplier: Float = 1.8
    let add: Float = 32.0
    return multiplier * value + add
}
```
So I assume all hardcoded Float values are treated as Doubles when the code is generated. Swift itself can have issues with this, but only when it infers the type of a variable. A literal generally takes on the type of the surrounding expression, but it is true that Swift's default inferred type for a floating-point literal is Double, so I can see why it generates the way it does.
```swift
let a = 32.0          // inferred as Double
let b: Float = 32.0
let c = a + b         // compile error: a is a Double
let d = b + 32.0      // inferred as Float
```
The only issue for me is that I don't really need the extra precision, and a Double is twice the size (8 bytes vs. 4 bytes).
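The size difference is easy to verify in plain Swift (nothing assumed beyond the standard library):

```swift
// Double is twice the width of Float on current Swift platforms.
print(MemoryLayout<Float>.size)   // 4
print(MemoryLayout<Double>.size)  // 8
```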
So again, no fix needed, but it might be nice in the future to have hardcoded number values infer their type from the surrounding variables.
For context: the cito function in question translates to Swift with a type conversion to Double and then back to Float; specifying the types and storing the multiplier and the added constant in variables (as above) avoids the round trip.
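Since the original snippet didn't survive, here is a sketch of what the Double round trip presumably looks like, reconstructed from the description above (not cito's actual output; the `Temperature` wrapper type is an assumption for illustration):

```swift
// Hypothetical reconstruction of the generated Swift before typing the
// locals: the literals 1.8 and 32.0 default to Double, so `value` is
// widened to Double for the arithmetic and the result narrowed back.
struct Temperature {
    static func celsiusToFahrenheit(_ value: Float) -> Float {
        return Float(1.8 * Double(value) + 32.0)
    }
}

print(Temperature.celsiusToFahrenheit(0))  // prints "32.0"
```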