This is driving me slightly crazy. I have a hierarchy-type custom setting with a currency field.
Inside a trigger I do the following:
Global_Settings__c gs = Global_Settings__c.getOrgDefaults();
if (gs.Id == null)
{
    gs.Minimum_Amount__c = 0;
    insert gs;
}

Double minimum = (Double) Global_Settings__c.getOrgDefaults().Minimum_Amount__c;
Pretty straightforward, and it works exactly as it should. Where it breaks down is when the trigger is fired from a test method: for some reason the last line causes an internal server error, but I can't see any reason why it should (this is the same whether or not I insert a custom setting record in the test method body).
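For what it's worth, a test as simple as this is enough to hit the error (a minimal sketch: the object the trigger runs on and the Opportunity fields used here are assumptions for illustration):

@isTest
private class GlobalSettingsTriggerTest
{
    static testMethod void insertFiresTrigger()
    {
        // Inserting the record fires the trigger above, which reads the
        // org defaults and casts Minimum_Amount__c to a Double; that last
        // line is where the internal server error appears.
        Opportunity op = new Opportunity(
            Name = 'Repro',
            StageName = 'Prospecting',
            CloseDate = Date.today(),
            Amount = 100);
        insert op;
    }
}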
Major Update - New Question Below
So I took the red pill, carried on down the rabbit hole, and it turns out it's pretty deep. What I've managed to establish through the debug logs is that there's a problem when comparing the value of the currency field from the settings with another value.
Changing the cast from a Double to a Decimal in the original code doesn't cause a problem; however, comparing that Decimal to anything else does. I used this code to set up some values:
Global_Settings__c gs = Global_Settings__c.getOrgDefaults();
if (gs == null)
{
    gs = new Global_Settings__c();
    gs.SetupOwnerId = UserInfo.getOrganizationId();
}
gs.Minimum_Amount__c = 200.0;
upsert gs;

Decimal minimum = Global_Settings__c.getOrgDefaults().Minimum_Amount__c;

Opportunity op = new Opportunity(Amount = 100);
Decimal a = Decimal.valueOf('' + minimum);
I then tested various options using these debug statements:
System.Debug(op.Amount > a); // All good, outputs "false" as expected
System.Debug(op.Amount > minimum); // Blows up
System.Debug('POW!'); // Never run
Even creating a new Decimal variable and assigning minimum to it doesn't work, though perhaps that's just copying a reference (I don't believe it is, but this behaviour would indicate otherwise). I can get by with the conversion to a String and back again for now... but the question is: why is this causing an issue?
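In the meantime, the workaround can be wrapped up so it only lives in one place (a sketch; safeDecimal is a hypothetical helper name, not anything from the platform):

// Round-trips the value through a String so the result compares
// cleanly; passes null through untouched.
private static Decimal safeDecimal(Decimal value)
{
    return value == null ? null : Decimal.valueOf(String.valueOf(value));
}

// Usage:
Decimal minimum = safeDecimal(Global_Settings__c.getOrgDefaults().Minimum_Amount__c);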
Could you try with @SeeAllData=true and an existing valid currency value (i.e. don't set the test value in code)? At a guess I suspect it isn't being stored as a valid decimal. Also, what are the length and decimal places on your currency? You aren't defining a value out of range by chance? – Daniel Ballinger Sep 17 '13 at 02:41
Currency(16, 2). Working in this org again today so will try with existing data now. – Matt Lacey Sep 17 '13 at 03:22
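For anyone following along, Daniel's suggestion would look something like this (a sketch; it assumes the org already holds a valid stored value for Minimum_Amount__c):

@isTest(SeeAllData=true)
private class GlobalSettingsExistingDataTest
{
    static testMethod void comparesExistingOrgValue()
    {
        // Read the value already stored in the org rather than
        // setting one up in the test body.
        Decimal minimum = Global_Settings__c.getOrgDefaults().Minimum_Amount__c;
        Opportunity op = new Opportunity(Amount = 100);
        System.debug(op.Amount > minimum);
    }
}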