If you’d like to generate super large files for stress test purposes, you don’t have to waste time pumping data into a file to make it grow. Instead, simply set the desired file size to reserve the space on disk.
This creates a 1GB test file:
# create a test file
$path = "$env:temp\dummyFile.txt"
$file = [System.IO.File]::Create($path)
# set the file size (file uses random content)
$file.SetLength(1GB)
$file.Close()
# view file properties
Get-Item -Path $path | Select-Object -Property Name, Length
I'd imagine the clusters (allocation units) the huge file was mapped to were zeroed out, as a full (not quick) format would do. If the clusters had been previously used and released, you'd find remnants of the previous file there.
Every byte appears as "00" (NUL) in the file. I'm not sure why the code comment states it uses random content.
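You can check that yourself by dumping the first bytes (a sketch for Windows PowerShell; PowerShell 7 would use -AsByteStream instead of -Encoding Byte):
# show the first 16 bytes of the dummy file as hex
Get-Content -Path "$env:temp\dummyFile.txt" -Encoding Byte -TotalCount 16 |
  ForEach-Object { '{0:X2}' -f $_ }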
That is what this .NET method does. See details here: https://msdn.microsoft.com/en-us/library/system.io.file.create(v=vs.110).aspx
It's an odd thing, but doable. It's quick, there's no disk thrashing to create the file(s), and it serves the use case. Now, if you are testing IOPS, then no, you need the writes of an actually populated file.
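For that scenario, here is a minimal sketch (file name and size are just placeholders) that actually writes pseudo-random content to disk:
# write 1 GB of pseudo-random data in 1 MB chunks
$path = "$env:temp\randomFile.bin"
$buffer = New-Object byte[] (1MB)
$rng = New-Object System.Random
$stream = [System.IO.File]::Create($path)
for ($i = 0; $i -lt 1024; $i++) {
  $rng.NextBytes($buffer)
  $stream.Write($buffer, 0, $buffer.Length)
}
$stream.Close()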
I can imagine other use cases for this approach as well.
How is that possible - "a huge empty file"?
It's just a reservation, so the latter.
Does it actually write random data there, or is it a huge empty file?
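Empty - the content reads back as all zeros, and nothing is actually written. You can see it by timing the creation (a sketch):
# the 1 GB file appears almost instantly because only its size is set
Measure-Command {
  $file = [System.IO.File]::Create("$env:temp\dummyFile.txt")
  $file.SetLength(1GB)
  $file.Close()
}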