I don't understand how to use bitflags (like the presets in the GUI) within Galaxy code.
When I convert my map to script with the GUI, my bitflag preset variable comes out typed as an integer. But with plain integers I can't use the syntax shown below (the editor complains about a wrong type).
The syntax is as follows:
MyBitflag = 0x00000001 | 0x00000002 | 0x00000004 | ... | 0x80000000; // combine any of the 32 flag bits with bitwise OR
if ((MyBitflag & (0x00000001 | 0x00000008)) != 0) { // test flags with bitwise AND; == would compare the whole value at once
    DoSomething();
}
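For what it's worth, here is a minimal raw-Galaxy sketch of that approach, assuming Galaxy's bitwise & and | operators on plain ints; the constant and function names are my own, not anything the GUI generates:

const int c_flagA = 0x00000001;
const int c_flagB = 0x00000008;

int gMyBitflag;

bool HasFlag (int value, int flag) {
    return (value & flag) != 0; // bitwise AND isolates the flag bit
}

void Test () {
    gMyBitflag = c_flagA | c_flagB; // bitwise OR combines flags
    if (HasFlag(gMyBitflag, c_flagB)) {
        // DoSomething();
    }
}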
It's not hard to do this yourself, though I prefer the version above.
My version would be:
ModI(Int / 2^(Flag - 1), 2) == 1 // to check whether flag number Flag (1-based) is set
Int = Int + 2^(Flag - 1)         // to set flag number Flag (only if it isn't already set)
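And for completeness, a sketch of how that manual version could look in raw Galaxy. Note that ^ above means "to the power of" (GUI arithmetic notation), not Galaxy's XOR operator, so the sketch below builds the power of two with a loop and computes the remainder with - and *, avoiding any modulo function. All names are mine:

int Pow2 (int n) {
    int result = 1;
    while (n > 0) {
        result = result * 2;
        n = n - 1;
    }
    return result;
}

// true if flag number flag (1-based) is set in value
bool FlagIsSet (int value, int flag) {
    int shifted = value / Pow2(flag - 1);
    return (shifted - (shifted / 2) * 2) == 1; // remainder of shifted/2, without a modulo operator
}

// returns value with flag number flag set; safe even if it was already set
int WithFlag (int value, int flag) {
    if (FlagIsSet(value, flag)) {
        return value;
    }
    return value + Pow2(flag - 1);
}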