Good work! Before you get started modeling, it's important to know that Spark only handles numeric data. That means all of the columns in your DataFrame must be either integers or decimals (called 'doubles' in Spark).
When we imported our data, we let Spark guess what kind of information each column held. Unfortunately, Spark doesn't always guess right, and you can see that some of the columns in our DataFrame are strings containing numbers rather than actual numeric values.
To remedy this, you can use the .cast() method in combination with the .withColumn() method.
It's important to note that .cast() works on columns, while .withColumn() works on DataFrames.
The only argument you need to pass to .cast() is the kind of value you want to create, in string form. For example, to create integers, you'll pass the argument "integer", and for decimal numbers you'll use "double".
You can put this call to .cast() inside a call to .withColumn() to overwrite the already existing column, just like you did in the previous chapter!
What kind of data does Spark need for modeling?