As long as it reads the same, and the alloy works in the barrel and on the target, do we care that it reads 2, 3, even 5 points off in a particular range? Not especially, as long as it reads the same for every batch. We make up a batch that's 5 points harder or 5 points softer, big deal, as long as that's what we meant to do; we know it's 1/3 harder or softer than the measured 15 BHN WW. It'd be awkward if we had a certified dead soft at 10, but at least we would know our 17-18 could be used for HP or anything calling for a 20-1 alloy. We would also know that a 32-35 was closer to A/C lino. If it measures the same and it works the same, it doesn't matter what it reads, as long as it's the same.
Great post~! ^^^^, and not just the part I quoted.
My "take", or "opinion" below, in agreement, but longer than yours as I tend to ramble...
(More so as I get older, trying to stuff in as much as I can before I am no longer on the "green side of the sod", it seems. Days are numbered, in my case. No one is forced to read it. )
I call the idea you are describing in the quote, particularly the underlined part, using a "Benchmark".
Once the "benchmark" is established and "you" (anyone) use it consistently to measure similar parameters against that "benchmark", you have established a "standard" that "YOU" (anyone) can go/measure from, even if "your" (anyone's) use of that "benchmark/standard" is not the same as others', IF they have set their own benchmark/standard.
I understand that back in the day (like in the Bible) there was a "measurement" called a "Cubit".
IIRC, it was the distance from a man's elbow tip to the tip of the middle finger.
Now, the "average" person might have that be a measurement, in "inches" , as, lets say, 15 inches
(
< as an example, don't know the average nor am I gonna measure my own to find out.)
Regardless, it was a "benchmark/standard" for the average person to use to measure things.
Just like an "Inch", IIRC, was decided to be the distance the sovereign Kings thumb, measured from the tip of the thumb to the first knuckle,/joint.
Once again, not everyone's thumb measures the same distance, but it was used as a "Benchmark/standard" for that time...
I rambled...
I am only trying to help some folks who could be confused to understand that it does not matter so much whether one person's benchmark/standard is the same as anyone else's; what matters is that the benchmark/standard anyone uses is consistent and remains the same for all other measurements being compared to the same thing one is measuring.
BHN/Hardness, be it measured with whatever tool ya use, be it a "Rockwell Scale", an LBT, a Lee, or a Cabinetree one, it really doesn't matter which, as long as it is used "consistently" and used to measure the same items.
IOW, one can't use a "yardstick" in inches and try to say it is the same as a "meter" broken down into centimeters or millimeters, unless one has a way to convert one into the other.
As an example: my 2 Lee pots for melting & casting do not read the same on the dial to hold the same temp. One might read 7 on a scale of 1-10 and the other 7.5 or 8, but I use the same Lyman digital thermometer for both (it was checked for consistency against the boiling point of water, 212°F, and found to read 4°F low, at 208°F, but that does not matter, because it is "consistent" and serves as a "benchmark").
With that, I can melt alloy, be it the same for both pots, or one pot at handgun BHN and another for rifle bullets with a higher BHN level, and the temps may not even be exactly the same for whatever particular alloy I am trying to use, due to trying to get the different types of metals to "mix". But since I am using a "constant" or "benchmark/standard" from the beginning until the end & final product, it doesn't matter to me whether anyone else's end result is the same as mine, as long as I know I was consistent & got the results I want, based on the benchmarks/standards I started with & stayed with all through the process.
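That thermometer check is really just measuring a fixed offset once and trusting it everywhere after. A minimal sketch of the idea (the 212°F boiling point and the 208°F reading come from the post; the function name is just an illustration):

```python
# Known reference: water boils at 212 °F at sea level.
BOILING_POINT_F = 212.0

# The thermometer read 208 °F in boiling water, so it reads 4 °F low.
# Measure that offset once against the benchmark, then apply it to
# every later reading from the same thermometer.
measured_at_boil = 208.0
offset = BOILING_POINT_F - measured_at_boil  # +4.0 °F

def corrected_temp(reading_f: float) -> float:
    """Correct a raw reading from this thermometer by its known offset."""
    return reading_f + offset

# A pot showing 700 °F on this thermometer is really about 704 °F.
print(corrected_temp(700.0))  # -> 704.0
```

The point, as in the post, is that the absolute accuracy barely matters; what matters is that the same instrument, with the same known offset, is used for every pot, so all the readings stay comparable.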
I like to be consistent, and I tend to go with 1 BHN per 100 fps. Meaning if I am pushing a cast bullet at 800 fps, I want at least 8 BHN. Pushing at 1200 fps, I want at least 12 BHN or so, etc. Rifle: if I want to push a bullet at 1500 fps, I want at least 15 BHN, & so on.
As long as I am "consistent" in measuring my alloy hardness, it seems to work fine, plus or minus, for "me".
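That 1-BHN-per-100-fps rule is simple arithmetic; a quick sketch (the function name is mine, and this is the poster's personal rule of thumb, not a universal law):

```python
import math

def min_bhn(velocity_fps: float) -> int:
    """Rule of thumb from the post: at least 1 BHN per 100 fps,
    rounded up to the next whole BHN point."""
    return math.ceil(velocity_fps / 100.0)

for fps in (800, 1200, 1500):
    print(fps, "fps ->", min_bhn(fps), "BHN minimum")
# 800 fps -> 8, 1200 fps -> 12, 1500 fps -> 15
```

Rounding up just means an in-between velocity like 1250 fps calls for 13 BHN rather than 12, which matches the "at least" wording in the post.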
How I do it: I make sure my method of testing hardness (in my case a Lee tester) is set up the way it is supposed to be, and I check it against some pure lead that I KNOW is pure lead. Then I test it against some monotype, foundry type, or linotype, knowing that the measurements I am getting should be reasonably close to what they should be.
IOW, I make sure I have a "Benchmark/Standard". Even if I am off a bit from what many folks use as a scale for hardness, I can extrapolate the results I get, simply from knowing what I want & how the scale worked out for me as I tested the hardness of the alloy(s).
(Doesn't matter if I used a Lee, a Cabinetree, an LBT, some pencils, or a Rockwell, etc.; I set a "benchmark/standard" to remain consistent & use the same one all across to do the measuring.)
Yeah, here I went again and rambled some more.
Just trying to help out those who might not "get" the idea about "parameters", & how not to worry about what everyone else is doing, but to make sure one is consistent in what one's own self is doing.
Alloy, Lube, size of diameter in relationship to ones firearm, etc..
Doesn't matter, as long as one is consistent with ones own stuff.
Okay. I got this out of my system.
If ya read this far, Thanks~!,
if not and ya skipped over it... I don't care.
Not gonna edit. Tough it out. I did in typing it.