I'm writing because I'm not clear on how the fracture energy is defined.
I am using a very simple model simulating a uniaxial compression test on a cube. I define the material with ASDConcrete3D (preset Concrete 9P).
As required, I enter the fracture energy, calculated from literature formulas, with a value of 7 N/mm.
I use the tester to compare the curves, and if I enter an lch equal to 100 mm (the size of the cube I have to compare the results against) in the tester, I obtain a graph comparable with my experimental results.
When I run the analysis simulating the compression test, if I use a mesh size (100 mm) equal to the sample size, I get the same result as the tester. But if I change the mesh size (e.g. to 5 mm), I get completely different values.
In the following post (viewtopic.php?p=10032&hilit=fracture+energy#p10032) I read that the fracture energy is always assumed to be in F/L units, so the automatic regularization is ON by default.
So I don't understand why the final result changes with the mesh size. I attach a graph in which the dotted lines are my experimental results and the solid lines are several attempts at numerical tests.
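To make my reasoning concrete, here is a minimal sketch of how I understand the regularization is supposed to work (my own illustration, not the actual ASDConcrete3D source code): the fracture energy Gf in N/mm is divided by the element characteristic length lch, so the softening branch is rescaled per element and the total dissipated energy stays mesh-independent.

```python
# My own illustration of energy regularization, NOT the ASDConcrete3D code.
# Gf is the fracture energy in N/mm (= N*mm/mm^2); lch is the element
# characteristic length in mm. The specific (per-unit-volume) dissipation
# that the softening law should reproduce inside one element is gf = Gf/lch.

def specific_fracture_energy(Gf, lch):
    """Energy per unit volume (N*mm/mm^3) to dissipate over length lch."""
    return Gf / lch

Gf = 7.0  # N/mm, my literature value
for lch in (100.0, 5.0):  # the two mesh sizes I tried
    gf = specific_fracture_energy(Gf, lch)
    # With regularization ON, gf * lch should recover Gf for any mesh,
    # so the global response of the 100 mm cube should not change.
    print(f"lch = {lch:6.1f} mm -> gf = {gf:.3f} N/mm^2, gf*lch = {gf*lch} N/mm")
```

If this picture is right, both meshes should dissipate the same total energy, which is exactly why the mesh dependence I observe confuses me.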

The curves are named to indicate both the fracture energy value (in N/mm) and the mesh size.
I don't think it's a problem with the model, because I've tried it with other materials that work correctly; this material seems mesh dependent.
Can you please help me clarify this doubt?
Thank you in advance for your help.
Best regards,
Mari