Scaling/Transformation
To directly answer your two main questions here: the answer is no to both.
First off, there is pretty much no defensible reason to scale or transform your variables here. Even if you want to include an interaction, you can add tensor product smooths to your model and avoid the scaling/transformation that interactions sometimes require in other kinds of regression. The only thing you really need to get right in that respect is the link function/family for your outcome. Unfortunately, without knowing more about your data it is hard to say more, but if your outcome is normally distributed there is nothing to change, and it looks like you have already modeled it that way with family = gaussian(). So just leave your predictors in without scaling or transformation. I don't know whether the scale and control arguments were included for that purpose, but I would simply remove them, and I also don't know what cluster is doing here.
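To illustrate the tensor product point, here is a minimal sketch on simulated data (x1, x2, and y are made-up names, not anything from your model): a te() smooth captures a smooth interaction on the predictors' original scales, so no scaling is needed.

library(mgcv)

#### Hypothetical Interaction Sketch ####
# Simulated data only -- x1, x2, y are placeholders, not your variables.
set.seed(1)
d <- data.frame(x1 = runif(200, 0, 10), x2 = runif(200, 50, 150))
d$y <- sin(d$x1) + 0.01 * d$x2 + rnorm(200, sd = 0.2)

fit.te <- gam(
  y ~ te(x1, x2, bs = "cr"),  # tensor product smooth handles the interaction
  data = d,
  family = gaussian(),
  method = "REML"
)
summary(fit.te)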
As an example of why scaling is not helpful, I have fit two models to the MASS package's mcycle data, one with the predictor scaled and one without, and then plotted both.
#### Load Libraries ####
library(MASS)
library(mgcv)
library(tidyverse)
#### Scale Data ####
data <- mcycle %>%
  mutate(scale_times = as.numeric(scale(times)))  # z-scored times; as.numeric() drops the matrix that scale() returns
#### Fit Unscaled Model ####
fit.reg <- gam(
  accel ~ s(times, bs = "cr"),
  data = data,
  method = "REML"
)
#### Fit Scaled Model ####
fit.scale <- gam(
  accel ~ s(scale_times, bs = "cr"),
  data = data,
  method = "REML"
)
#### Plot Them ####
par(mfrow = c(1, 2))
plot(fit.reg)
plot(fit.scale)
You can probably see an issue already:
[Figure: left panel, GAM fit on raw times; right panel, GAM fit on scaled times]
This mcycle data set is often used to demonstrate GAMs; it records head acceleration (in g) against time (in milliseconds) after a simulated motorcycle crash. The plot on the left is the fit to the raw data. With it we can predict, or at least eyeball, how the crash test dummy's head accelerates at specific times (for example, around 15 milliseconds we expect the head to jerk in one direction, then whiplash back after about 20 milliseconds). The plot on the right (the scaled version) has completely lost that interpretability: the x-axis is now in z-scores, so we can no longer tell when, in real time, those changes in head acceleration occur. What does a z-score of zero mean here? Very little, if anything.
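And to be clear, the scaling buys you nothing in terms of fit: scale() is just a linear shift and rescale, so the two models should recover essentially the same curve. You can check this directly (I would expect the difference below to be near zero, up to numerical error):

#### Compare The Fits ####
# Scaling only relabels the x-axis; the fitted values of the two models
# should be essentially identical, so this maximum difference should be
# close to zero.
max(abs(fitted(fit.reg) - fitted(fit.scale)))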
Of course, the rescaling you use may solve a different problem (maybe you are converting units, hours to minutes or the like), but as your model stands you really shouldn't adjust the data this way, and other transformations would likely yield equally dubious results.
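A harmless version of that would be a plain unit conversion, which keeps the axis meaningful, e.g.:

# Converting units (here milliseconds to seconds) still leaves the
# x-axis interpretable; it just reads in different units.
data <- data %>%
  mutate(times_sec = times / 1000)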
Spline Terms
I'm also not sure what you are doing with your splines. In your question you say they are cubic regression splines, but that is not what your code specifies. You are using bs = "ts", a thin plate regression spline (with shrinkage), and I would personally recommend against that in a bam model. You are presumably using bam for speed of estimation (a large number of observations, etc.), and you are already using discrete to speed up the optimization further. Thin plate regression splines tend to slow estimation down, and probably unnecessarily so, because the full thin plate basis starts with as many functions as there are unique covariate values (or combinations of values) and then has to be reduced by an eigen-decomposition (see Simpson, 2018). Since you are only using single-predictor smooths here, you could instead use something less computationally demanding, such as a cr cubic regression spline; a sketch follows below. It would also help to know why gamma is in the call if you are including thin plate regression splines: gamma is usually increased above 1 to force a smoother fit.
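As a rough sketch of what that could look like (reusing the mcycle variables from above so it actually runs; substitute your own outcome, predictors, and k):

#### Example bam Fit With cr Splines ####
# Rough sketch only: the same kind of call with bam(), a cubic
# regression spline instead of bs = "ts", and discretization for speed.
fit.bam <- bam(
  accel ~ s(times, bs = "cr", k = 10),
  data = data,
  family = gaussian(),
  method = "fREML",   # bam's fast REML criterion (required for discrete = TRUE)
  discrete = TRUE     # discretize covariates to speed up estimation
)
summary(fit.bam)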
Knots
You have set k to k here (I'm assuming you saved that value elsewhere), but generically applying the same basis dimension to every predictor may not be wise. Look at what your data actually do and decide whether some smooths need more flexibility. If, for example, a relationship has several turns and you have k = 5, the fitted smooth will miss much of that curvilinearity.
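One way to check whether your choice of k is adequate is mgcv's own basis dimension diagnostic. Using the earlier unscaled fit as an example:

#### Check Basis Dimension ####
# gam.check() reports a k-index and p-value for each smooth (plus
# residual plots); a k-index well below 1 with a small p-value suggests
# k may be too low for the wiggliness in the data.
gam.check(fit.reg)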