We’re working on creating two web kiosks for the Vineland Lobby in the new building, and Nate and I have been building them for the last couple of days. We’ll be using two iMac G5s as our kiosk stations, and the kiosk software we’ve chosen is wKiosk. We’ve also decided to run a proxy server on the machines to cache content and make them faster; for that we’re using the open source software SquidMan. The initial setup was very quick: wKiosk is fairly simple to use, and it really does lock down the machine well.

We’re giving the user featured links on the left-hand side of the screen. To do this I’ve made a frameset that we can modify, which allows the user to click and view any pages we decide to feature. The problem was that many parts of our website (both old and new) use hrefs with target="_top", which of course breaks you out of a frameset. Clicking on said links when browsing those sites gets rid of the featured-link nav I set up for the kiosk. We simply can’t change the live sites, but we can’t live with their current implementation for the kiosks either. Nate had the idea of dynamically rewriting the HTML as it was requested, changing all targets to point at our main content frame instead of _top. He found another open source app named Privoxy that does just this, and it works great! Now we can happily surf around on the kiosk without the fear of busting out of our kiosk frameset.

As we poked around some more, we found an issue with our Walker Channel. We normally require users to load video into the Real Player to view it, but wKiosk does not allow this, as it takes over the entire OS and renders other apps inoperable (which is great as a security feature, but not for the usability of the Channel). Privoxy came to the rescue again, changing every link to a .ram file into a link to a dynamic page I created that embeds the Real file, thus allowing it to play on the kiosk. The videos were taking forever to load up, and it wasn’t the buffering that was the problem, it was connecting initially to the file.
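A rewrite like the one described above can be sketched with Privoxy’s filter and action files; the filter names, the content-frame name, and the wrapper-page URL below are illustrative assumptions, not the actual Walker configuration:

```
# user.filter -- sed-style substitution jobs (names here are hypothetical)
FILTER: retarget Point target="_top" links back at the kiosk content frame
s@target="_top"@target="content"@ig

FILTER: ram-embed Turn .ram links into links to a wrapper page that embeds the Real file
s@href="([^"]+\.ram)"@href="/embed?file=$1"@ig

# user.action -- apply both filters to every URL the kiosk can reach
{ +filter{retarget} +filter{ram-embed} }
/
```

Privoxy applies the filters to pages as they stream through the proxy, so the live sites themselves never need to change.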
How do I configure SquidMan to work with conda? I think that SquidMan itself is set up correctly: if I switch it off and try to browse the internet, I get the error message "The proxy server is refusing connections", and this happens for both http and https websites, and also if I enter an IP directly (no DNS in between). Hoping to see very fast download speeds for those python packages, I gave conda the same addresses as in the system proxy settings, which seem to work fine for browsing. There is a "maximum object size" setting, and maybe that prevents the conda downloads from being cached, so I dialled those settings up to the max (well bigger than the conda download) and tried again.

It sounds like you have likely configured everything properly for proxying traffic through squid. However, conda uses https to download its packages, and in a basic configuration squid can only pass SSL connections through from the client to the server; that traffic is already encrypted, so it cannot be cached. The options available to you are:

- Use squid's ssl bump feature to have squid decrypt and re-encrypt the data passing through it. Getting this set up is somewhat tricky, because you have to generate a self-signed certificate and get it trusted by conda (using conda install --insecure might be all that is needed on the conda side).
- Use a dedicated caching repository server. Such a server is offered as a commercial product, so you are unlikely to find much built-in support for this in the open source conda tools; Sonatype's Nexus repository manager also claims to proxy conda repositories in its documentation.
- Use conda's built-in support for local caching. Since you referenced conda clean in your question, you are aware of this cache and must have some reason for not using it. For a single machine, the conda pkgs_dir should work pretty well; for multiple machines, maybe you could get by with a network-shared pkgs_dir, or with copying all the .conda files from one machine into the pkgs_dir on each of the others.
- Add a second layer of proxying. conda allows you to specify channels with an http protocol, so you could set up a proxy server that accepts http requests and passes them on as https requests, then put your squid caching proxy in front of this http-to-https proxy. This way, squid will see plain http traffic that it can inspect and cache, and you can still access https-only conda repositories.
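The last option above can be sketched as a tiny http-to-https relay. The upstream host, listen address, and channel path below are placeholders, and a real relay would also need streaming, header passthrough, and error handling:

```python
# Sketch of the "second layer of proxying" idea: a plain-http server that
# re-issues each request against an https upstream. Squid, placed in front
# of this relay, then sees cacheable unencrypted traffic.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

UPSTREAM = "https://repo.anaconda.com"  # placeholder upstream channel host
LISTEN = ("127.0.0.1", 8080)            # placeholder listen address

def upstream_url(path: str) -> str:
    """Map an incoming plain-http path onto the https upstream."""
    return UPSTREAM + path

class RelayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Fetch from the https upstream, then replay the response over http.
        with urlopen(upstream_url(self.path)) as resp:
            body = resp.read()
            status = resp.status
        self.send_response(status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(LISTEN, RelayHandler).serve_forever()
```

With the relay running, a hypothetical `.condarc` entry such as `channels: [http://127.0.0.1:8080/main]` would route package downloads through squid as cacheable http (the exact channel path depends on your upstream's layout).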