Most of the time I use htop to check CPU thread usage on Linux. I have quite a few FreeBSD servers, so I ran into some issues when installing htop on vanilla FreeBSD. You need to make a few adjustments to get htop working, so here it goes:
First you have to dynamically load the Linux compatibility module into the kernel:
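The module is loaded with kldload (a standard FreeBSD command; run it as root):

```shell
# Load the Linux binary-compatibility module into the running kernel
kldload linux
```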
Then we have to make this loading permanent, so add linux_enable="YES" to /etc/rc.conf.
After this, install a linux layer:
cd /usr/ports/emulators/linux_base-fc4 (for some strange reason fc6 is not working for me)
make install distclean
Go to /etc/fstab and add the following line:
linproc /compat/linux/proc linprocfs rw 0 0
Mount the new filesystem with mount linproc, then go to /usr/ports/sysutils/htop and install as usual.
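Those last two steps sketched out, assuming the fstab entry above is in place (make install clean is the usual ports invocation):

```shell
# Mount linprocfs using the fstab entry, then build htop from ports
mount linproc
cd /usr/ports/sysutils/htop
make install clean
```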
Note to self: if you have a DNS cluster, the clocks on the servers must be kept in sync, otherwise the zones won't update when you synchronize.
1:43 – Finished adding SPF records for the mail coming from the web server. A bitch it is. It seems that since around the end of 2006, Yahoo, MSN, Gmail, and other big services have made SPF records mandatory. These are entries added to the DNS zone on the nameserver that manages the domain in question; when mail is sent, the recipient checks whether active SPF records exist, and if they don't, the mails are bounced back. That's how you end up losing contact with users on Yahoo, MSN, etc.
Anyway, the solution is as follows:
In the domain's DNS zone, add:
domain.com. 14400 IN TXT "v=spf1 ip4:126.96.36.199/24 mx ptr a:hostname include:optional_hostname -all"
188.8.131.52/24 = the IP range from which mail might legitimately be sent
hostname = the server's hostname
optional_hostname = if you have a second mail server
To test whether it works, use the procedure at http://senderid.espcoalition.org/results.php.
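You can also check the published record directly with dig (domain.com stands in for your actual domain, as in the example above):

```shell
# Query the domain's TXT records and look for the v=spf1 entry
dig +short TXT domain.com
```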
Now brb, off to implement DomainKeys.
[warn] (2)No such file or directory: Failed to enable the ‘httpready’ Accept Filter
If this error shows up on FreeBSD when Apache 2 tries to start, run kldload accf_http and then restart Apache.
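A sketch of the fix, plus a loader.conf entry to make it survive reboots (the exact restart command depends on how your Apache was installed):

```shell
# Load the HTTP accept filter module, then restart Apache
kldload accf_http
apachectl restart

# To load the module automatically at boot, add this line to /boot/loader.conf:
# accf_http_load="YES"
```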
Dell confirmed on Wednesday plans to offer Linux pre-installed on select desktop and notebook systems, beyond its current Linux-based servers and Precision workstations. The decision comes after feedback on its IdeaStorm site and a survey that garnered over 100,000 responses.
No specific timeframe was given for the expanded Linux plans, although the company said in a blog posting that it will provide an update in the coming weeks regarding the effort. It will detail “information on which systems we will offer, our testing and certification efforts, and the Linux distribution(s) that will be available,” Dell said, adding that, “The countdown begins today.”
We’ve seen plenty of crazy ways to keep your precious data safe. Some people burn a few tons of DVDs, others make a monthly habit of swapping hard drives to a safe location. In today’s How-To we’ll show you how to automatically keep your computer’s data backed up with ssh and rsync. Feel that? That’s our warm comfy safe-data blankie. Check it out.
What about backup software? There are so many flavors of software, only Google could count them all. Since we like our choice of operating systems, we want something that’ll work from our Mac, Windows or Linux machines. We’ll cover some good backup software options next time — for now it’s the down and dirty, nitty gritty network backups.
First of all, we’re going to need somewhere to keep our data. For the tools we’ll be using, we’ll need a server that we can access via secure shell (ssh) from anywhere on the net. If you only want to backup your stuff at home, that’s fine.
For our first example, we’ll be using Ubuntu Linux on our laptop with our Linux web server. You can get an inexpensive shared host from a web host provider or roll your own like we did. Our particular backup solution updates a single copy of our data rather than taking incremental snapshots over time. You can do it either way, but for our needs, we just want our current data set kept alive.
Once you’ve decided where to keep your data, and what you want to backup on your laptop or workstation, you’ll need the tools to keep things rolling.
The heart of our — and many others’ — cross platform backup is a combination of ssh and rsync. The secure shell is probably the most useful networking application ever. We’ll use it to transport our data securely to our backup location. To update the files, rsync will be used — it’s designed to copy and synchronize data from one location to another.
For our example, we’ll back up /home/willo/data to our server. The top directory ‘data’ should be created on the server. We’ll use an Ubuntu laptop; if they’re not already installed, you can easily install rsync and ssh with this command:
sudo apt-get install rsync openssh-client
The following command will copy and update the data inside /home/willo/data to our server’s directory /home/willo/data. When it’s run by hand, it requires the ssh password for user willo on the server. Not a big deal, but when it’s automated, we won’t be there to enter the password.
rsync -avz -e ssh data willo@server:/home/willo
To get around the password requirement, we need to create a pair of ssh keys. The keys will allow our ssh connections without user intervention. (This also means that someone else could connect if they get your key…)
Here’s the command for copy/paste ease:
ssh-keygen -t dsa -b 1024 -f /home/willo/backup-key
The command creates a pair of keys. The private key will allow us to open our connection. We’ll need to copy the public key to our destination/backup server. Once it’s there, append it to the server’s authorized keys file. If you don’t have one, just rename the public key to authorized_keys instead of appending it.
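Copying and appending the public key can be sketched like this (the server name is a placeholder; the key paths come from the example above):

```shell
# Copy the public half of the key pair to the backup server
scp /home/willo/backup-key.pub willo@server:~

# Then, logged in on the server, append it to the authorized keys list
mkdir -p ~/.ssh
cat ~/backup-key.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```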
Now we can run our backup with a single simple command and no password prompt. Again, here it is for quick cut and paste:
rsync -avz -e "ssh -i /home/willo/backup-key" /home/willo/data willo@server:/home/willo/data
As always, replace the paths at will. (Har har.)
Now we’ll put our backup command into a script so we don’t have to remember every detail. This is the quick and dirty version. We ran vi backup-data.sh, added a bash header and our rsync command, saved it, and ran chmod 500 backup-data.sh so we can execute it but other users can’t look at it.
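The resulting script is tiny; a sketch, with the server name as a placeholder:

```shell
#!/bin/sh
# backup-data.sh - push the local data directory to the backup server
rsync -avz -e "ssh -i /home/willo/backup-key" /home/willo/data willo@server:/home/willo/data
```

Don’t forget the chmod 500 backup-data.sh afterwards.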
Cron is a scheduling program. We can schedule software to run as often or as rarely as we want. To regularly run our backup program, we’ll create a cron job. Use the command crontab -e to edit your crontab. The first five fields determine how often to run the job, and the command follows. In this case, we’ll sync our data every 30 minutes.
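The crontab entry for a 30-minute interval would look something like this (the script path is assumed from the example above):

```shell
# min hour day-of-month month day-of-week  command
# Run the backup script every 30 minutes
*/30 * * * * /home/willo/backup-data.sh
```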
Now you know how to do it from a linux box, but how about Windows? You can do the same thing by installing cygwin – it’s a set of unix tools built to run under the Windows environment. Download the installer here.
Run the installer and step through the process. When you get to the package selection window, you’ll need to select cron, ssh and rsync.
Once the installer finishes, open up the cygwin bash shell program. From here you’ll be able to perform the same steps we outlined for linux. The only real difference is in the directory names. We suggest putting your data in an easy location like C:\data. Then you can use /cygdrive/c/data in your commands instead of the usual c:\data.
You can do the same things under Mac OS X; all the tools are already installed. Just navigate to Applications/Utilities and open up the Terminal program. After that, you can follow the same instructions. A word of warning: rsync’s support of resource forks has been an issue. You’ll probably want to look into using rsyncx. If you’re dealing with simple data like image files, normal rsync should get the job done.
Now that you know how to keep your data backed up to a server, where should it go? Well, how about to an off the shelf NAS like the Buffalo Terastation? With a few modifications, we used the same solution with ours.
A visit to the Terastation wiki turned up a few hacks that open up the box’s latent abilities. We installed firmware from this page to gain telnet and root access to the box. The updater is pretty large, but it worked just fine for us.
After the update, we opened a telnet session to our Terastation. (We gave ours a static IP and set the gateway and DNS settings.) To quickly and easily install ssh, we used the following command, logged in as myroot:
tar zxvf /home/dropbear.tgz
With ssh running, we created a user using the normal control panel. All users default to /home, but you can edit /etc/passwd and set something like /home/willo if you want to keep things separate. Create /home/willo/.ssh and copy the public key into authorized_keys. After that, decide where to keep your backup. Don’t use the home directory — put it under one of the normally shared directories under /mnt/array1.
If the worst should happen, or you just need a copy of your data, you can snag it using a couple of tricks. To securely copy the whole shebang, scp is the easiest. (scp — secure copy — is built into OpenSSH.) You can use a key, but your password will work just as well.
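A sketch of a full restore with scp (the server name is a placeholder; -r recurses into the backup directory):

```shell
# Recursively copy the entire backup directory back to the local machine
scp -r -i /home/willo/backup-key willo@server:/home/willo/data /home/willo/
```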
If you’ve got most of your data on another machine, but want to update it with the latest changes from your working copy, you can use rsync with the source and destination reversed. Again, you can use the key or not; it’s your choice. (Just don’t try to run this by hand while the crontab rsync is running as well.)
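The reversed command would look something like this (the server name is a placeholder):

```shell
# Pull the latest backed-up copy from the server into the local tree
rsync -avz -e "ssh -i /home/willo/backup-key" willo@server:/home/willo/data /home/willo
```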
It’s important to keep the limitations of this method in mind. If you’re working with huge files, then you’ll need some major bandwidth. You probably won’t get that 2GB file copied to the server from the local coffee shop’s DSL connection. Thanks to rsync, you can pretty easily add or update smaller data files. Even if the upload doesn’t complete, rsync will pick up where it left off the next time it connects. Keep your eyes out for part two, when we’ll look at a few backup options that don’t require a terminal to use.
Panda has released NanoScan, an online virus scanning service that can perform a full sweep of a computer in less than one minute. The speed is a vast improvement over current virus scanners, which can take an hour or more to complete.
The company isn’t giving specifics on how the software works, saying only that it requires a small 400KB ActiveX download. No software is installed on the user’s computer; the scanner is hosted on Panda’s servers, which ensures the signature files are continually up-to-date.
Hosting the signature file online is the company’s answer to a problem it says will eventually demand a new way of combating virus and malware writers.
“Panda had foreseen that digital vandals and Internet criminals would eventually win the day simply by overwhelming systems with signature files too large to be of practical use, unless something radically new and different were done,” the company said in a statement.
Around 600,000 threats will be detectable through the service, with more added daily through the company’s ‘Anti-Malware Collective Intelligence’ platform. The system uses detection of new threats worldwide as a way to keep its anti-virus signature files continuously up-to-date.
This system works hand-in-hand with its TruPrevent technology, the company said, which detects malicious code without the need for it to be in the antivirus software’s signature file.
In beta, the scanner is available for use free of charge from nanoscan.com. It was not immediately clear if the company will charge for the final version.
“In response to overwhelming user demand for Linux, Dell has posted a survey on a company blog that asks ‘PC users to choose between Linux flavors such as Fedora and Ubuntu, and to pick more general choices such as notebooks versus desktops, high-end models versus value models and telephone-based support versus community-based support.’ Votes will be collected through March 23, and Dell plans to use the feedback to begin selling Linux-based consumer PCs.” The poll is pretty minimal. Wonder how much it will really guide Dell’s choices.
The ClamAV team is proud to announce the long awaited ClamAV 0.90. This version introduces lots of new interesting features and marks a big step forward in the development of our antivirus engine.
The 0.9x series introduces lots of improvements in terms of detection rate and performance, like support for many new packers and decryptors, RAR3 and SIS archives, and a new phishing signatures format that proves to be very effective.
One of the most important changes is the availability of scripted updates. Instead of transferring the whole cvd file at each update, only the differences between the latest cvds and the previous versions will be transferred.
In case the local copy of the latest cvd is corrupted or the scripted update fails for some reason, freshclam will fallback to the old method.
Similarly to cvd files, scripted updates are compressed and digitally signed and are already being distributed. They will dramatically reduce traffic on our mirrors and will allow us to release even more updates in the future.
Another noticeable change is the new configuration syntax: you can now turn individual options on and off, and the old crude hack of “DisableDefaultScanOptions” is no longer required.
Cosmetic changes apart, the 0.9x series introduces lots of new code, but some parts are not compiled in by default because they are not ready for production systems yet. You are encouraged to pass the --enable-experimental flag to ./configure when compiling ClamAV. The experimental code introduces many improvements in terms of detection rate and performance.
If you find a bug, please take some time to report it on our bugzilla.
Your help in testing the new code is really appreciated.
RAR3, SIS and SFX archives support is finally available together with new unpackers and decryptors: pespin, sue, yc, wwpack32, nspack, mew, upack and others. Additionally, ClamAV now includes better mechanisms for scanning ELF, PDF and tar files. The email decoding has been improved to reduce both the memory requirements and the time taken to process attachments.
As part of the Google Summer of Code program, we have introduced support for a new phishing signatures format that has proved very effective in detecting phishing emails. The ClamAV phishing module allows better and more generic detection of phishing emails by searching for URLs in email messages, and comparing the real site with the URL displayed to the user in the message.
On the performance side, support for the MULTISCAN command has been implemented in clamd, allowing multiple files to be scanned simultaneously.
Support for Sensory Networks’ NodalCore acceleration technology is now available in ClamAV and will be compiled in if the ncore libraries are detected at compile time. NodalCore acceleration allows highly improved scan speeds on systems equipped with NodalCore cards.
Malware coverage has been extended considerably during the last few months thanks to the hard work of our sigmakers. At the moment the ClamAV Virus Database contains over 90,000 signatures, and the number keeps increasing day by day.
Today we are also launching our new website. We strived to make it W3C compliant, so it should be easy to navigate from just about any browser. Feel free to send suggestions or report problems to Luca.
Thanks to the people from our newly formed polyglot team, the website is available in multiple languages. We look forward to adding even more languages in the next few months, in particular Russian and French; we would also like to translate the full ClamAV documentation. If you are interested in helping us with this task, please don’t hesitate to contact Luca.