Hi,
Can an expert let me know if zero is a factor of itself?
If I am correct, there are two ways to look at the definition of a factor, and I am getting contradictory results from the two (or I am terribly mistaken about the definition itself). Either way, if the question sounds stupid, I apologize in advance.
Definition 1: If a is a factor of b, then b / a is an integer.
If I go by this definition, then 0 / 0 should be an integer; however, anything divided by 0 is undefined.
Hence, my "logical" (or not so logical) brain says that zero is NOT a factor of itself.
Definition 2: However, if I define a factor as:
a is a factor of b if b = a * k for some integer k.
By this definition, 0 seems to be a factor of itself, since 0 = 0 * k holds for any integer k.
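To make the contrast concrete, here is a small Python sketch of the two definitions (just an illustration, not a proof; the function names and the brute-force search range are my own choices):

```python
# Definition 1: a is a factor of b if b / a is an integer.
def is_factor_div(a, b):
    try:
        return (b / a).is_integer()
    except ZeroDivisionError:
        return False  # 0 / 0 is undefined, so the division test fails

# Definition 2: a is a factor of b if b = a * k for some integer k.
# Brute-force search over a small range of k, for illustration only.
def is_factor_mult(a, b, search_range=100):
    return any(b == a * k for k in range(-search_range, search_range + 1))

print(is_factor_div(0, 0))   # False: division by zero is undefined
print(is_factor_mult(0, 0))  # True: 0 = 0 * k for every integer k
```

The two tests agree for nonzero a (e.g. both say 3 is a factor of 12) and disagree only at a = 0, b = 0, which is exactly the contradiction described above.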
In a nutshell, I am lost!
Can an expert help me out here and share his/her view on this?
Tagging a few quant wizards to the thread; however, anyone is welcome to share an opinion:
IanStewart chetan2u avigutman KarishmaB