I have been working hard with GNU Prolog. I now have a working JSON library, a working HTTP encode/decode library, and a rough but working HTTP web server, all in lovely pure GNU Prolog, but I keep hitting a problem when I hammer it with a simple GET request from JMeter.
I also have a working RabbitMQ module, and when trying to push one million messages I hit the same issue and had to ramp up the LOCALSZ value to get it to pass that particular test, so I am obviously doing something *wrong* in all of my applications at the moment. I must have misunderstood the nature of tail-call optimisation in Prolog. I am a seasoned Erlang developer and I use Haskell, LISP and Scheme too, so I understand the concepts!
Educate me with gprolog please!
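Since you ask to be educated, here is a minimal, self-contained sketch of the usual gotcha (the predicate names are invented for illustration, not taken from your code): in Prolog, a syntactic tail call is only a *last call* in the optimised sense when no choice point is pending at the moment of the call. Any nondeterminate goal earlier in the clause body pins the frame.

```prolog
% classify/2 deliberately leaves a choice point: when the first clause
% succeeds, the second clause is still pending as an alternative.
classify(N, pos) :- N > 0.
classify(_, other).

% loop_bad/1: the choice point left by classify/2 survives the recursive
% call, so the frame cannot be reclaimed and the local stack grows by one
% frame per iteration.
loop_bad(0).
loop_bad(N) :- N > 0, classify(N, _), N1 is N - 1, loop_bad(N1).

% loop_good/1: the cut discards the pending choice point, the recursive
% call becomes a true last call, and the loop runs in constant local stack.
loop_good(0).
loop_good(N) :- N > 0, classify(N, _), !, N1 is N - 1, loop_good(N1).
```

`?- loop_good(10000000).` runs in constant local stack, while `loop_bad/1` with the same argument piles up frames and will eventually overflow. Raising LOCALSZ (gprolog reads the stack sizes from environment variables, in kilobytes) only postpones that kind of leak; it never fixes it.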
Also, I recently had a similar issue with my FastCGI project which, whilst working, was subject to the same problem, but now I can't avoid tackling it! I suspect it is because garbage collection is not happening: either I have failed somehow to make my code tail-call friendly, or I have done something else that means I am hanging on to memory I don't need. I am confused as to why and where it is happening, though.
I plan to publish my HTTP libraries and server when they are done, and the RabbitMQ module too, so any help would be greatly appreciated at this juncture.
Apologies in advance for the length of this email. I will list only the minimal amount of code and try to explain how I *think* it is working.
==> Server Init
http_server_start(Port, Callback) :-
    logger_info("http_server_start: PID: ~w", [Pid]),
    % ... rest of the predicate snipped ...
The server loop is the first tail-call predicate: it waits for a connection, handles it (a simple echo) and then goes around again. I capture the LOCALSZ size and it is using a massive 177,480 bytes per page request; no wonder it eventually runs out of memory. ARGH!
==> Server Loop
http_server_loop(S, Callback) :-
    socket_accept(S, _Client, In, Out),
    logger_info("Accepted connection: ~w", [_Client]),
    Handler =.. [Callback, Out, Request],
    % ... rest of the predicate snipped ...
I was under the impression that *any memory allocated* during the call to the Handler is released when it returns; after all, no parameters are passed in other than the HTTP request and the output stream back to the browser client. *Possible screw-up alert*: when I call http_server_loop(S, Callback) in tail-call position, have I just created a choice point, causing all memory used to be stashed until a return that is never going to happen? Ah ha! Is that the cause of it, then? Or is it as simple as placing a cut after the handler has returned? I will try both! Excited now!
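For what it is worth, here is a hedged rewrite of the loop along the cut-based line of thinking (socket_accept/4, logger_info/2, http_server_get_request/2 and the callback convention are assumed to be the predicates from this mail, so this is a sketch, not a drop-in fix): the cut just before the recursive call discards every choice point created during the iteration, so the recursive call really is a last call and the frame can be reused.

```prolog
http_server_loop(S, Callback) :-
    socket_accept(S, Client, In, Out),
    logger_info("Accepted connection: ~w", [Client]),
    http_server_get_request(In, Request),
    Handler =.. [Callback, Out, Request],
    ( call(Handler) -> true ; true ),   % force the handler to be determinate
    close(In),
    close(Out),
    !,                                  % no pending choice points past this point
    http_server_loop(S, Callback).      % true last call: constant local stack
```

The `( Goal -> true ; true )` wrapper matters as much as the cut: even a handler that succeeds can leave its own choice points behind, and the if-then-else commits to its first solution.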
I have read stuff about this but it's all a bit vague to me at the moment. How do I tell it that I don't want to keep anything? I think it has to do with forcing a fail: do I need to refactor the memory-hungry part, run it, fail, catch the fail (after the garbage collection?) and then tail-call my way around again?
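The "do it, fail, go around again" pattern you are half-remembering is real: gprolog reclaims stack and trail space on backtracking, so failing back to repeat/0 after each request frees everything the request allocated. A hedged sketch, again assuming the predicate names from this mail:

```prolog
http_server_loop(S, Callback) :-
    repeat,                              % infinite choice point to fail back to
      socket_accept(S, _Client, In, Out),
      http_server_get_request(In, Request),
      Handler =.. [Callback, Out, Request],
      ( catch(call(Handler), _, true) -> true ; true ),
      close(In),
      close(Out),
    fail.                                % backtrack to repeat/0, reclaiming memory
```

The trade-off is that a fail-driven loop cannot carry state from one iteration to the next through arguments; only side effects (the bytes already written to the closed streams) survive the backtracking.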
The default callback predicate is a simple HTTP request echo that generates a quick and dirty HTML page that shows the various parts of the request, along with the LOCALSZ statistics for tracing its progress down to zero! Currently that predicate looks like this:
==> Echo Handler
http_server_echo(Out, (Method, Url, Query, Headers, Body)) :-
    logger_info("(callback) http_echo: url: ~s", [Url]),
    fmap(http_encode_header, Headers, HeadersOut),
    join(HeadersOut, "<br/>", HeadersStr),
    statistics(local_stack, [UsedSize, FreeSize]),
    format_to_codes(LocalStack, "Used/Free: ~d,~d", [UsedSize, FreeSize]),
    % the page is assembled from fragments like these (assembly snipped):
    "<tr><td width='150px'>Method</td><td>", MethodStr, "</td></tr>",
    "<tr><td>Headers</td><td>", HeadersStr, "</td></tr>",
    "<tr><td>Url</td><td>", Url, "</td></tr>",
    "<tr><td>Query</td><td>", "-query-string-", "</td></tr>",
    "<tr><td>Local Stack</td><td>", LocalStack, "</td></tr>",
    "<tr><td>Body</td><td>", Body, "</td></tr>",
    % ... rest of the predicate snipped ...
The final part of the equation is actually receiving the HTTP request, which uses a set of DCG rules I've created to parse an HTTP request according to the latest RFC documents:
==> HTTP Request
http_server_get_request(In, (Method, Url, Query, Headers, Body)) :-
    http_server_untilcrlf(In, [], ReqData),
    phrase(http_request_line(Method, Url, Query, '1.1'), ReqData, Buf1),
    http_server_untilcrlf(In, Buf1, HdrData),
    phrase(http_headers(Headers), HdrData, PreBody),
    list_find_def('Content-Length', Headers, Length, 0),
    http_server_recv_body(In, Length, PreBody, Body).
http_server_get_request(_, ('FAIL', _, _, _, _)).
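One observation about the shape of that predicate (an observation about the excerpt, not a claim about the full source): with a catch-all second clause like that, every *successful* parse still leaves a choice point for the fallback clause pending across the whole handler, pinning the local stack until something cuts it. The usual shape that keeps the fallback but stays determinate is a cut at the end of the first clause; here is a self-contained toy with invented names showing the pattern:

```prolog
% get_or_default/2: once the parse succeeds, the cut removes the pending
% alternative, so the fallback clause can never be re-tried and no choice
% point survives the call.
get_or_default(In, Value) :-
    parse(In, Value), !.
get_or_default(_, default).

parse(ok, parsed_ok).   % toy stand-in for the real DCG-based parse
```

`?- get_or_default(ok, V).` binds `V = parsed_ok` and succeeds deterministically; `?- get_or_default(junk, V).` falls through to `V = default`.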
OK, thanks for reading this far. Without blasting you with all the code: is there anything in my style so far that would make a professional Prolog hacker slap their head and go "No no no no no! That's not tail-call optimised", or, for that matter, anything else that's "ugly" or could be improved? I am getting more and more into Prolog; after 40+ years of telling computers what to do, I think I am now prepared to tell them when things are true and correct and let them figure it out!
In the meantime I will continue to track down the problem. I have all the good books, so I should eventually be able to deduce the true nature of the problem and solve it myself... shouldn't I?
Thanks very much in advance,
Users-prolog mailing list