After a few days of testing code and watching my DRAM consumption over time, I have found that, for some reason, the following code leaks memory.

Code:
var BVvertexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, BVvertexBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);
gl.vertexAttribPointer(shaderReference.vertexPositionAttribute, 3, gl.FLOAT, false, 0, 0);

//create the indices for the 12 edges of the box
var indices = [
	//top of the cube
	0, 1,
	1, 3,
	2, 3,
	2, 0,
	//bottom of the cube
	4, 5,
	5, 7,
	6, 7,
	6, 4,
	//vertical edges connecting top and bottom
	0, 4,
	1, 5,
	2, 6,
	3, 7
];

var BVindexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, BVindexBuffer);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(indices), gl.STATIC_DRAW);
var BVindexBufferNumItems = indices.length;

gl.drawElements(gl.LINES, BVindexBufferNumItems, gl.UNSIGNED_SHORT, 0);

//delete the buffers to avoid a memory leak!
gl.deleteBuffer(BVvertexBuffer);
gl.deleteBuffer(BVindexBuffer);

The code renders a box bounding volume. The leak is very small per call, most likely only a few bytes. I call this function about 10 times per frame at 60 fps, and it takes 90-120 minutes for the leak to grow to about 1 GB. If I dramatically increase the size of the buffers, the speed of the leak does not increase, so WebGL does appear to be freeing the buffer storage itself when I call deleteBuffer.
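For context, the obvious workaround would be to create the buffers once and only bind and draw per frame, rather than allocating and deleting them on every call. A minimal sketch of that pattern is below; `drawBoundingBox` and `cachedBV` are made-up names for illustration, not my actual code:

```javascript
// Cache for the bounding-volume buffers; filled on first draw, reused after.
let cachedBV = null;

function drawBoundingBox(gl, shaderReference, vertices, indices) {
  if (cachedBV === null) {
    // One-time upload. STATIC_DRAW hints the data will not change.
    const vertexBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);

    const indexBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
    gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(indices), gl.STATIC_DRAW);

    cachedBV = { vertexBuffer, indexBuffer, numItems: indices.length };
  }

  // Per-frame work: bind and draw only, no allocation or deletion.
  gl.bindBuffer(gl.ARRAY_BUFFER, cachedBV.vertexBuffer);
  gl.vertexAttribPointer(shaderReference.vertexPositionAttribute, 3, gl.FLOAT, false, 0, 0);
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, cachedBV.indexBuffer);
  gl.drawElements(gl.LINES, cachedBV.numItems, gl.UNSIGNED_SHORT, 0);
}
```

With this pattern, `gl.deleteBuffer` would only be called once at teardown instead of every frame, so even a tiny per-create/delete cost could no longer accumulate.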

Am I making some stupid mistake, or is there a tiny leak in WebGL itself? I'm using Google Chrome.