To avoid further confusion: the krb5 PHP extension is now hosted on PECL. There is now a public SVN repository at http://svn.php.net/repository/pecl/krb5/trunk/ which contains some bug fixes and improvements over the rc2 version published here. Please use the SVN version.
Also feel free to use the PECL bug tracker if you experience any issues.
UPDATE: The library location issues that many people suffered from should now be fixed. Sorry it took so long.
- [BUG] Fixed two memory corruption bugs (NegotiateAuth::doAuthentication, GSSAPIContext::inquireCredentials)
- [BUG] Fixed usage of krb5_random_confounder, which was removed as of MIT krb5 1.8
- [FEATURE] Added GSSAPIContext::registerAcceptorIdentity to register a keytab from which credentials are fetched
- [BUILDBUG] Added /usr/include/et to include path to fix kadmin headers
- [BUILDBUG] Also scan lib64/ directories for kerberos libraries
- kadmin API is now officially exported (starting with 1.7), but slightly changed ... need to fix this
- Apache does not provide the authentication header via the default mechanism, but it is possible to obtain it anyway (circumvent this issue with a rewrite rule that passes the Authorization header).
- anything I have possibly missed....
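For the Apache issue above, a common workaround (assuming mod_rewrite is enabled; the exact directive placement depends on your setup) looks roughly like this:

```apache
# Re-expose the Authorization header to PHP as an environment variable,
# since Apache/mod_php does not pass it through by default.
RewriteEngine On
RewriteCond %{HTTP:Authorization} ^(.+)$
RewriteRule .* - [E=HTTP_AUTHORIZATION:%1]
```

PHP can then read the header from $_SERVER['HTTP_AUTHORIZATION'] (or $_SERVER['REDIRECT_HTTP_AUTHORIZATION'], depending on the configuration).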
Thanks to those who reported the issues. Please let me know if I missed some of your issues.
Also, I'm probably going to sign up for a PECL account; then a public SVN repository and bug tracker will be available.
It has been a really long time since I last wrote anything about it - but I had the chance to work on my Kerberos PHP extension earlier this year. There have been some API changes and one big new feature: It now contains bindings for GSSAPI functions which might be really useful for people implementing kerberized protocols in PHP. Also API documentation is now included and credential cache management has been changed to better work in web environments.
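To give an idea of what server-side use of the extension looks like, here is a minimal sketch. The class and method names are taken from the changelog above (NegotiateAuth, GSSAPIContext::registerAcceptorIdentity); the keytab path is made up, and the exact signatures should be checked against the bundled API documentation:

```php
<?php
// Sketch only -- names taken from the changelog, keytab path is hypothetical.

// Register a keytab from which acceptor credentials are fetched.
GSSAPIContext::registerAcceptorIdentity('/etc/apache2/http.keytab');

$auth = new NegotiateAuth('/etc/apache2/http.keytab');
if ($auth->doAuthentication()) {
    // The client presented a valid Kerberos ticket via SPNEGO.
    echo 'authenticated';
} else {
    // Ask the browser to start the Negotiate handshake.
    header('HTTP/1.1 401 Unauthorized');
    header('WWW-Authenticate: Negotiate');
}
```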
It took me literally tens of hours to figure out how to do SPNEGO proxy authentication with Java's built-in HTTP routines. So let me share my results:
Documentation sucks as hell, and there is plenty of it
Java's implementations of GSSAPI and Kerberos:
a) fail to establish a security context with my MIT krb5 services (I was able to authenticate against neither mod_auth_kerb nor my squid negotiate helper),
b) are a nightmare to configure, and
c) seem to be unable (without really ugly hacks) to obtain credentials from MIT's default credential cache (they do not honor the KRB5CCNAME environment variable)
During the last days I had some time to work on different things that had been sitting in the work queue for quite some time:
First of all, I had the chance to look at the possibilities for OpenAFS web administration again. As I might have written before, the libadmin library is such a mess in terms of documentation that I really did not want to dig into it - so I wanted to try out what was possible using JAFS. Until now there was no support for use in non-KAS cells, but some weeks ago there was a post on openafs-dev about using it in a K5 cell (and also fixing the code to compile with Java 1.5), so this was the chance to try it out.
It really took some time to get it kind of working - but finally (after patching out the various kas_ calls that were still in the code, which made it block almost infinitely, plus some other patches to make it compile) it did. After that I could hardly believe that the php-java-bridge worked perfectly out of the box. So I could take phpSATk and build some objects and definitions around the interface (this unfortunately was a bit more complicated than the other things I have built so far, because I had to do the right type casts in the wrapper) and hell yeah ... now it seems to work. And I updated my php_afs extension to only take a Kerberos ticket (from php_krb5) and perform the "aklog" step against the AFS cell to get the right token into the kernel.
This might be a pretty glued-together solution (and there are some rough spots, like error handling, which are not that nice), but yes - IT WORKS!!
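The ticket-then-token flow described above can be sketched like this. Only KRB5CCache::initPassword is taken from the php_krb5 API documentation; the principal and the php_afs call are hypothetical placeholders for whatever the extension actually exposes:

```php
<?php
// Sketch of the flow: obtain a Kerberos ticket via php_krb5, then let
// php_afs turn it into an AFS token (the "aklog" step).

$ccache = new KRB5CCache();                       // from php_krb5
$ccache->initPassword('user@EXAMPLE.COM', $pass); // hypothetical principal

afs_aklog($ccache);  // hypothetical php_afs call that sets the kernel token
```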
The other thing I have been working on is the ICAP server (which still needs a unique name), which has made some great progress. It now supports previews and persistent connections, has the basis for a parser that extracts text tokens from the source for content filtering, and the score management is to be replaced by a real solution soon. Squid3 is (marked) stable now - I already have it installed in a production environment, and the only thing on the bug list right now is an annoying ICAP bug (to be reported) - so it might be a good idea to get it working soon. (Help is always appreciated.)
I have not had time to work on phpSATk for quite a while (there will be some progress soon ... promised). The squid negotiate helper is kind of ready (I have been using it in a test environment for some time now), and there will soon be a publicly available version (the impatient might have a look at the SVN version). My PHP CRL patch will probably be included in PHP HEAD as soon as I manage to put together some test cases, and possibly in 5.3 afterwards.
But the project I spent the most time on in the last weeks was building an ICAP-based web filtering engine. In the past I felt like all existing (open source) web filters have major shortcomings:
- completely relying on URL black-/whitelists totally sucks - the number of false positives and false negatives is extremely high, higher-quality blacklists are expensive, and you never know what the political/commercial/whatever interests of the people/institutions putting sites on these blacklists are.
- the only free content filtering solution I know about is DansGuardian, which relies on proxy chaining - which sucks when it comes to authentication. This approach also is not as flexible as I'd like it to be. The licensing terms are IMHO a bit too restrictive.
- none of the solutions I know are really configurable at run time. In production use I need the possibility to make online changes to the black-/whitelists and/or wordlists without disrupting connections and/or increasing latency (I'd consider writing to configuration files and/or black-/whitelists from a webapp unacceptable).
So I'm trying to build a solution with the following properties:
- uses the ICAP standard (squid3 is coming...)
- will be probably licensed under GPL
- will be a hybrid solution combining the results of content analysis and url filtering
- content analysis should include reliable word/phrase matching as well as parsing of PICS tags.
- will be based on scores (one for each category) which will be used to match a profile of allowed sites
- contains efficient "database engines" for the different data types used - each of them manageable in real time through an RPC (XMLRPC for now) interface.
- should be extremely fast (it already uses threads and asynchronous I/O) and scalable
- ... will integrate with phpSATk as administrative interface
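To illustrate the score/profile idea from the list above: each page gets a score per category, and a profile defines the maximum score allowed per category. This is just an illustrative sketch (function and category names are made up, not the engine's actual code):

```php
<?php
// Illustrative sketch: deny a page if any category score exceeds the
// limit defined in the user's profile.
function is_allowed($scores, $profile) {
    foreach ($scores as $category => $score) {
        if (isset($profile[$category]) && $score > $profile[$category]) {
            return false; // category limit exceeded -> deny
        }
    }
    return true;
}

$profile = array('violence' => 20, 'adult' => 0, 'gambling' => 50);
$scores  = array('violence' => 5,  'adult' => 0, 'gambling' => 80);

var_dump(is_allowed($scores, $profile)); // bool(false): gambling over limit
```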
So far I have a working prototype which can already deny/allow access based on server IP (really useful for some popular sites which have hundreds of aliases), host/domain name, and regular expressions. Both parsing PICS tags and reliable, fast phrase/word matching are very hard to implement - so it may take some time until I can show something working.
I'll announce this thingy (and the negotiate/GSS helper) to the squid users/dev soon so maybe somebody volunteers to contribute.
Update: squid-3.0-stable1 is released ... it's time to get this thingy working ... the ICAP protocol implementation is kind of feature complete now (previews and persistent connections are implemented).