[En-Nut-Discussion] Heap, Multi-Thread HTTP application, and memory Loss
Eric Haver
havereric1 at gmail.com
Thu Oct 25 06:46:26 CEST 2007
Hi,
I am using Nut/OS 4.4.0 with YAGARTO arm-elf-gcc 4.1.1 on the SAM7-EX256.
The httpserv.c demo opens four threads for the server. The demo serves a
straightforward HTML file and works well.
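For context, the demo spins up its service threads roughly like this; the
thread names and the stack size below are from memory and may not match
the shipped httpserv.c exactly:

    #include <sys/thread.h>

    /* The demo's service thread; its body is sketched further down. */
    extern void Service(void *arg);

    /* Roughly how the four server threads get started. */
    static void StartHttpThreads(void)
    {
        int i;

        for (i = 1; i <= 4; i++) {
            char thname[] = "httpd0";

            thname[5] = (char) ('0' + i);
            NutThreadCreate(thname, Service, (void *) (long) i, 640);
        }
    }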
My project requires running a JavaScript file that, in turn, loads other
files (.xml and .jpg) into the dynamic page divisions.
The problem is the loss of free memory.
Using NutDumpHeap(stdout), I see that the number of free blocks in the heap
increases after the first call to the page, and that the large free block at
the end shrinks after each subsequent call.
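To watch this between page loads I call a small helper of my own (the name
is mine); as far as I recall NutDumpHeap() is declared in <sys/osdebug.h>
and needs NUTDEBUG enabled:

    #include <stdio.h>
    #include <sys/heap.h>
    #include <sys/osdebug.h>

    /* Print the free list and the total free byte count. */
    static void ReportHeap(const char *tag)
    {
        printf("--- heap after %s ---\n", tag);
        NutDumpHeap(stdout);
        printf("available: %lu bytes\n", (unsigned long) NutHeapAvailable());
    }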
I scoured my threads to make them re-entrant, and even wrapped
NutEnterCritical()/NutExitCritical() around the use of the global variables
heapFreeList and available in NutHeapAlloc() and NutHeapFree().
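In case it helps, this is the pattern I applied, shown here on a dummy
global rather than on the real heap.c internals:

    #include <sys/atom.h>          /* NutEnterCritical() / NutExitCritical() */

    static volatile long shared_count;  /* stand-in for heapFreeList/available */

    /* The same few lines wrapped around every spot that touches the
     * shared globals: disable interrupts for the duration of the access. */
    static void TouchShared(void)
    {
        NutEnterCritical();
        shared_count++;
        NutExitCritical();
    }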
I even limited the number of threads to one, and the loss still occurs; the
page just loads very slowly.
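For reference, the service thread I am running is essentially the loop from
httpserv.c, condensed here from memory (error reporting stripped), so the
details may differ slightly from the shipped demo:

    #include <stdio.h>
    #include <io.h>
    #include <sys/thread.h>
    #include <sys/timer.h>
    #include <sys/socket.h>
    #include <pro/httpd.h>

    THREAD(Service, arg)
    {
        TCPSOCKET *sock;
        FILE *stream;

        for (;;) {
            if ((sock = NutTcpCreateSocket()) == 0) {
                NutSleep(1000);
                continue;
            }
            if (NutTcpAccept(sock, 80) == 0) {
                /* Attach a stdio stream so NutHttpProcessRequest() can use it;
                 * the cast is fine on the 32-bit ARM target. */
                if ((stream = _fdopen((int) sock, "r+b")) != 0) {
                    NutHttpProcessRequest(stream);
                    fclose(stream);          /* paired with _fdopen() */
                }
            }
            NutTcpCloseSocket(sock);         /* paired with NutTcpCreateSocket() */
        }
    }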
In heap.c I saw the caveat:
 * \note Do not use this function in interrupt routines.
Isn't a thread, by definition, an interrupt routine? Even the threads
emacrx and tcpsm use NutHeapAlloc() and NutHeapFree().
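Put differently, my expectation is that a matched allocate/free inside a
thread should leave the heap exactly as it found it; this is the kind of
check I have in mind (the helper name is mine):

    #include <string.h>
    #include <sys/heap.h>

    /* Returns nonzero if a matched NutHeapAlloc()/NutHeapFree() pair
     * restores the available byte count, which is what I expect from
     * re-entrant thread code. */
    static int AllocFreeRoundTrip(size_t size)
    {
        unsigned long before = (unsigned long) NutHeapAvailable();
        void *p = NutHeapAlloc(size);

        if (p) {
            memset(p, 0, size);     /* touch the block */
            NutHeapFree(p);
        }
        return (unsigned long) NutHeapAvailable() == before;
    }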
Any ideas? Anyone else having the same problem?
Eric Haver
HaverEric1 (at) gmail.com