I have this function in my C# app:
public static string SafeTrim(object str)
{
    if (str == null || str == DBNull.Value)
        return string.Empty;

    return str.ToString().Trim();
}
It works fine, but my import utility calls it millions of times while processing hundreds of thousands of records, and ANTS Profiler shows this function consuming a significant share of CPU time simply because it is called so often.
EDIT: I neglected to mention that a very common usage of SafeTrim() in my app is for DataRow/DataColumn values. Example: SafeTrim(dt.Rows[0]["id"]) - it's common for that to contain a DBNull.Value, and it's also common for that value to contain edge spaces that need to be trimmed.
Can it be optimized in any way?
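For context, here is one variant I've been sketching (not yet measured under load): it pattern-matches on string so values that are already strings skip the virtual ToString() call, and it skips Trim() entirely when there is no leading or trailing whitespace, so the common already-clean case allocates nothing. The StringHelpers class name is just for illustration.

```csharp
using System;

public static class StringHelpers
{
    public static string SafeTrim(object str)
    {
        // Fast path: the value is already a string, so avoid the
        // virtual ToString() call and, if there is no edge
        // whitespace, avoid Trim() as well.
        if (str is string s)
        {
            if (s.Length == 0 ||
                (!char.IsWhiteSpace(s[0]) && !char.IsWhiteSpace(s[s.Length - 1])))
                return s; // nothing to trim: return the same instance

            return s.Trim();
        }

        // Slow path: null / DBNull / everything else, as before.
        if (str == null || str == DBNull.Value)
            return string.Empty;

        return str.ToString().Trim();
    }
}
```

In my DataRow case this should help because column values typed as strings come back boxed as string references anyway, so the fast path is hit most of the time.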
EDIT: I'll be trying these different approaches, under load, and reporting back tomorrow. Thanks all!