
NickTheGreek

Administrators
  • Content Count

    452
  • Joined

  • Last visited

  • Days Won

    76
  • Feedback

    N/A

Everything posted by NickTheGreek

  1. from cPanel University https://support.google.com/mail/answer/81126?hl=en http://www.ietf.org/rfc/rfc2505.txt https://help.yahoo.com/kb/SLN3435.html
  2. This post is reposted from the Microsoft Azure Blog : What is Artificial Intelligence? <azure.microsoft.com/blog/what-is-artificial-intelligence/> Aug 9th 2018, 12:00, by Theo van Kraay It has been said that Artificial Intelligence will define the next generation of software solutions. If you are even remotely involved with technology, you will almost certainly have heard the term with increasing regularity over the last few years. It is likely that you will also have heard different definitions for Artificial Intelligence offered, such as: *“The ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings.”* – Encyclopedia Britannica *“Intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans.”* – Wikipedia How useful are these definitions? What exactly are “tasks commonly associated with intelligent beings”? For many people, such definitions can seem too broad or nebulous. After all, there are many tasks that we can associate with human beings! What exactly do we mean by “intelligence” in the context of machines, and how is this different from the tasks that many traditional computer systems are able to perform, some of which may already seem to have some level of *intelligence* in their sophistication? What exactly makes the *Artificial Intelligence* systems of today different from sophisticated software systems of the past? It could be argued that any attempt to try to define “Artificial Intelligence” is somewhat futile, since we would first have to properly define “intelligence”, a word which conjures a wide variety of connotations. Nonetheless, this article attempts to offer a more accessible definition for what passes as Artificial Intelligence in the current vernacular, as well as some commentary on the nature of today’s AI systems, and why they might be more aptly referred to as “intelligent” than previous incarnations. Firstly, it is interesting and important to note that the technical difference between what used to be referred to as Artificial Intelligence over 20 years ago and traditional computer systems, is close to zero. Prior attempts to create intelligent systems known as *expert systems* at the time, involved the complex implementation of exhaustive rules that were intended to approximate* intelligent behavior*. For all intents and purposes, these systems did not differ from traditional computers in any drastic way other than having many thousands more lines of code. The problem with trying to replicate human intelligence in this way was that it requires far too many rules and ignores something very fundamental to the way *intelligent beings* make *decisions*, which is very different from the way traditional computers process information. Let me illustrate with a simple example. Suppose I walk into your office and I say the words “Good Weekend?” Your immediate response is likely to be something like “yes” or “fine thanks”. This may seem like very trivial behavior, but in this simple action you will have immediately demonstrated a behavior that a traditional computer system is completely incapable of. In responding to my question, you have effectively dealt with ambiguity by making a prediction about the correct way to respond. It is not certain that by saying “Good Weekend” I actually intended to ask you whether you had a good weekend. Here are just a few possible* intents* behind that utterance: – Did you have a good weekend? – Weekends are good (generally). 
– I had a good weekend. – It was a good football game at the weekend, wasn't it? – Will the coming weekend be a good weekend for you? And more. The most likely intended meaning may seem obvious, but suppose that when you respond with "yes", I had responded with "No, I mean it was a good football game at the weekend, wasn't it?". It would have been a surprise, but without even thinking, you will absorb that information into a mental model, correlate the fact that there was an important game last weekend with the fact that I said "Good Weekend?", and adjust the probability of the expected response accordingly, so that you can respond correctly the next time you are asked the same question. Granted, those aren't the thoughts that will pass through your head! You happen to have a neural network (aka "your brain") that will absorb this information automatically and *learn* to respond differently next time. The key point is that even when you do respond next time, you will still be making a prediction about the correct way in which to respond. As before, you won't be certain, but if your prediction *fails* again, you will gather new data, which leads to my definition of Artificial Intelligence: "Artificial Intelligence is the ability of a computer system to deal with ambiguity, by making predictions using previously gathered *data*, and learning from errors in those predictions in order to generate newer, more accurate predictions about how to behave in the future." This is a somewhat appropriate definition of Artificial Intelligence because it is exactly what AI systems today are doing, and more importantly, it reflects an important characteristic of human beings which separates us from traditional computer systems: human beings are prediction machines. We deal with ambiguity all day long, from very trivial scenarios such as the above, to more convoluted scenarios that involve *playing the odds* on a larger scale. This is in one sense the essence of *reasoning*. We very rarely know whether the way we respond to different scenarios is absolutely correct, but we make reasonable predictions based on past experience. Just for fun, let's illustrate the earlier example with some code in R! First, let's start with some data that represents information in your mind about when a particular person has said "good weekend?" to you. In this example, we are saying that *GoodWeekendResponse* is our *score label* (i.e. it denotes the appropriate response that we want to predict). For modelling purposes, there have to be at least two possible values, in this case "yes" and "no". For brevity, the response in most cases is "yes". We can fit the data to a logistic regression model:

library(VGAM)
greetings <- read.csv('c:/AI/greetings.csv', header=TRUE)
fit <- vglm(GoodWeekendResponse~., family=multinomial, data=greetings)

Now what happens if we try to make a prediction on that model, where the expected response is different than we have previously recorded? In this case, I am expecting the response to be "Go England!". Below is some more code to add the prediction.
For illustration we just hardcode the new input data; the output follows the code:

response <- data.frame(FootballGamePlayed="Yes", WorldCup="Yes", EnglandPlaying="Yes", GoodWeekendResponse="Go England!!")
greetings <- rbind(greetings, response)
fit <- vglm(GoodWeekendResponse~., family=multinomial, data=greetings)
prediction <- predict(fit, response, type="response")
prediction
index <- which.max(prediction)
df <- colnames(prediction)
df[index]

            No   Yes   Go England!!
1  3.901506e-09   0.5            0.5
> index <- which.max(prediction)
> df <- colnames(prediction)
> df[index]
[1] "Yes"

The initial prediction "yes" was wrong, but note that in addition to predicting against the new data, we also incorporated the actual response back into our existing model. Also note that the new response value "Go England!" has been *learnt*, with a probability of 50 percent based on current data. If we run the same piece of code again, the probability that "Go England!" is the right response based on prior data increases, so this time our model *chooses* to respond with "Go England!", because it has finally learnt that this is most likely the correct response!

            No        Yes   Go England!!
1  3.478377e-09  0.3333333      0.6666667
> index <- which.max(prediction)
> df <- colnames(prediction)
> df[index]
[1] "Go England!!"

Do we have Artificial Intelligence here? Well, clearly there are different *levels* of intelligence, just as there are with human beings. There is, of course, a good deal of nuance that may be missing here, but nonetheless this very simple program will be able to react, with limited accuracy, to data coming in related to one very specific topic, as well as learn from its mistakes and make adjustments based on predictions, without the need to develop exhaustive rules to account for different responses that are expected for different combinations of data. This is the same principle that underpins many AI systems today, which, like human beings, are mostly sophisticated prediction machines. The more sophisticated the machine, the more it is able to make accurate predictions based on a complex array of data used to *train* various models, and the most sophisticated AI systems of all are able to continually learn from faulty assertions in order to improve the accuracy of their predictions, thus exhibiting something approximating human *intelligence*.

Machine learning

You may be wondering, based on this definition, what the difference is between *machine learning* and *Artificial Intelligence*. After all, isn't this exactly what machine learning algorithms do, make predictions based on data using statistical models? This very much depends on the definition of *machine learning*, but ultimately most machine learning algorithms are *trained* on static data sets to produce predictive models, so machine learning algorithms only facilitate part of the dynamic in the definition of AI offered above. Additionally, machine learning algorithms, much like the contrived example above, typically focus on specific scenarios, rather than working together to create the ability to deal with *ambiguity* as part of an *intelligent system*. In many ways, machine learning is to AI what neurons are to the brain: a building block of intelligence that can perform a discrete task, but one that may need to be part of a composite *system* of predictive models in order to really exhibit the ability to deal with ambiguity across an array of behaviors that might approximate *intelligent behavior*.
Practical applications

There are a number of practical advantages in building AI systems, but as discussed and illustrated above, many of these advantages are pivoted around "time to market". AI systems enable the embedding of complex decision making without the need to build exhaustive rules, which traditionally can be very time consuming to procure, engineer and maintain. Developing systems that can "learn" and "build their own rules" can significantly accelerate organizational growth. Microsoft's Azure cloud platform offers an array of discrete and granular services in the AI and Machine Learning domain <docs.microsoft.com/en-us/azure/#pivot=products&panel=ai> that allow AI developers and Data Engineers to avoid reinventing wheels and consume reusable APIs. These APIs allow AI developers to build systems which display the type of *intelligent behavior* discussed above. If you want to dive in and learn how to start building intelligence into your solutions with the Microsoft AI platform, including pre-trained AI services like Cognitive Services and the Bot Framework, as well as deep learning tools like Azure Machine Learning, Visual Studio Code Tools for AI, and Cognitive Toolkit, visit AI School <aischool.microsoft.com/learning-paths>.
  3. despite proper DKIM/SPF setup and appropriate Spam Assassin configuration as well as cPanel > Mail section > Default Address set to Current Setting: :fail: No Such User Here I have a friend receiving self sent emails that are obviously spam You would think this is common and easy to tackle but ... your could not be more wrong than that ! https://forums.cpanel.net/threads/self-sent-spam.334831/ https://forums.cpanel.net/threads/spam-sent-to-self.608551/ https://luxsci.com/blog/save-yourself-from-yourself-stop-spam-from-your-own-address.html So, what do we recommend? The simplest way to take care of this situation is to: Use Email Filtering systems that treat SPF and DKIM properly, to stop this kind of spam. Make sure that any catch-all email aliases are turned off (the ones that accept all email to unknown/undefined addresses in your domain and deliver them to you anyway — these are giant spam traps). Make sure that your email address and your domain name are NOT on your own Spam Filter allow or white list(s). Make sure that, if you are using your address book as a source of addresses to allow, that your own address is NOT in there (or else don’t white list your address book). Add the Internet IP address(es) of the servers from which you do send email to your allow list, if possible. Contact your email provider for assistance in obtaining this list and updating your filters with it. Add SPF to your domain’s DNS. Make it strict (i.e. “-all”) Use DKIM. Make it strict (i.e. “dkim=discardable”). See our DKIM Generator. Setup DMARC to enable servers to properly handle SPF and DKIM failures. Consider using Authenticated Received Chain (ARC) once it is available to you. It will provide further levels of validation to handle problems with SPF and DKIM. If you want to go further, consider use of technologies such as PGP or S/MIME for cryptographic signing of individual messages and consider “closed” email systems … where only the participants can send messages to each other.
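
    A quick way to sanity-check what a domain actually publishes before tightening these policies is to query its TXT records directly. A minimal sketch, assuming a shell with dig available; example.com and the "default" DKIM selector are placeholders, substitute your own domain and selector:

      # SPF record - look for "v=spf1 ... -all" (strict)
      dig +short TXT example.com
      # DKIM public key for a given selector
      dig +short TXT default._domainkey.example.com
      # DMARC policy - look for "v=DMARC1; p=quarantine" or "p=reject"
      dig +short TXT _dmarc.example.com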
  4. What's new in version 4.3.5 Version 4.3.5 is a small maintenance update to fix issues reported since 4.3.4. Also included: 4.3.4 Added a filter to view members that have opt-in for bulk mail in the ACP, and an option to opt-out those members Bug fixes
  5. http://www.youronlinechoices.com/gr/your-choices Manage Your Choices Below you will find some of the companies that work with websites to collect and use information in order to deliver behavioural advertising. Please use the buttons to set your preferences. You can switch all companies on or off, or set your preferences for individual companies. Clicking lets you learn more about the specific company, as well as whether it is enabled or not in the browser you are using. If you run into problems, please consult the Help section.
  6. These settings apply when you're using this browser and device SIGN IN to control settings for personalized ads across all of your browsers and devices Ads Personalization on Google Search See more useful ads when you're using Google Search The way Google saves your ad settings has changed. Learn more about how Google uses cookies for ad personalization Ads Personalization Across the Web See more useful ads on YouTube and the 2+ million websites that partner with Google to show ads https://adssettings.google.com/anonymous
  7. We ran into an interesting MySQL character encoding issue at Crowd Favorite today while working to upgrade and launch a new client site. Here is what we were trying to do: copy the production database to the staging database so we could properly configure and test everything before pushing the new site live. Pretty simple right? It was, until we noticed a bunch of weird character encoding issues on the staging site. It turned out that while the database tables were set to a Latin-1 (latin1), the content that populated those tables was encoded as UTF-8 (utf8). A variety of attempts to fix this failed, but what succeeded was as follows: Export the data as Latin-1. Because MySQL knows that the table is already using a Latin-1 encoding, it will do a straight export of the data without trying to convert the data to another character set. If you try to export as UTF-8, MySQL appears to attempt to convert the (supposedly) Latin-1 data to UTF-8 – resulting in double encoded characters (since the data was actually already UTF-8). Change the character set in the exported data file from ‘latin1’ to ‘utf8’. Since the dumped data was not converted during the export process, it’s actually UTF-8 encoded data. Create your new table as UTF-8 If your CREATE TABLE command is in your SQL dump file, change the character set from ‘latin1’ to ‘utf8’. Import your data normally. Since you’ve got UTF-8 encoded data in your dump file, the declared character set in the dump file is now UTF-8, and the table you’re importing into is UTF-8, everything will go smoothly. I can confirm that a half-dozen or so variations on the above do not work. This includes INSERT INTO newdb.newtable SELECT * FROM olddb.oldtable;. Also, if you’re doing this for a WordPress1 site (like we were), keep in mind that copying over the production database will generally mean that WP-Cache is enabled. You’ll want to remember to turn that off. Yeah. This is a fairly common issue in older WordPress installs because the MySQL database default is commonly Latin-1, and older versions of WordPress did not specify the character set when creating the database tables (so they would default to Latin-1) and the default encoding in the WordPress settings is UTF-8. [back] http://alexking.org/blog/2008/03/06/mysql-latin1-utf8-conversion
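
    For reference, the steps above map roughly onto the following shell commands. This is only a sketch under the article's assumptions (latin1-declared tables that actually hold UTF-8 data); olddb, newdb and dump.sql are placeholder names, and the sed pass should be reviewed before importing since it rewrites every occurrence of "latin1" in the dump:

      # 1. Export as Latin-1 so MySQL does a straight dump without converting the data
      mysqldump --default-character-set=latin1 --skip-set-charset olddb > dump.sql
      # 2. Relabel the dump and the CREATE TABLE statements as UTF-8
      sed -i 's/latin1/utf8/g' dump.sql
      # 3. Import into the new database as UTF-8
      mysql --default-character-set=utf8 newdb < dump.sql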
  8. Since the Poodle vulnerability (SSLv3) a number of clients disabling SSLv3 on CentOS 5 breaks compatibility with external sites and applications such as WHMCS and PayPal IPN. This is because TLS1.0 will be the only supported method. In order to support the TLS1.1 and TLS1.2 you can follow the steps below to force the use of the newer version of openssl: First we need to get the latest openssl version (all links provided in this article are the latest at the time of writing) wget 'http://www.openssl.org/source/openssl-1.0.1j.tar.gz' tar -zxf openssl-1.0.1j.tar.gz cd openssl-1.0.1j ./config shared -fPIC make make install Install latest curl to /usr/local/ssl rm -rf /opt/curlssl wget 'http://curl.haxx.se/download/curl-7.38.0.tar.gz' tar -zxf curl-7.38.0.tar.gz cd curl-7.38.0 ./configure --prefix=/opt/curlssl --with-ssl=/usr/local/ssl --enable-http --enable-ftp LDFLAGS=-L/usr/local/ssl/lib CPPFLAGS=-I/usr/local/ssl/include make make install Now we need to configure EasyApache to use what we’ve done, we will do this by creating two files. cd /var/cpanel/easy/apache/rawopts touch all_php5 touch Apache2_4 Edit all_php5 in your favourite text editor --enable-ssl --with-ssl=/usr/local/ssl --with-curl=/opt/curlssl LDFLAGS=-L/usr/local/ssl/lib CPPFLAGS=-I/usr/local/ssl/include Edit Apache2_4 in your favourite text editor --with-ssl=/usr/local/ssl LDFLAGS=-L/usr/local/ssl/lib CPPFLAGS=-I/usr/local/ssl/include Go into WHM goto EasyApache, Select build from current profile or customise as you require. Once completed you now have TLS 1.2 that will survive upgrades! For forwarding secrecy and high encryption ratings add the following from WHM > Apache Configuration > Include Editor > Pre VirtualHost Include, choose either all versions or your current version and paste the below code into the box SSLProtocol -SSLv2 -SSLv3 +TLSv1.2 +TLSv1.1 +TLSv1 SSLCompression off SSLHonorCipherOrder on SSLCipherSuite ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES256-SHA:DHE-RSA-AES256-GCM-SHA384:AES256-GCM-SHA384:DHE-RSA-AES256-SHA256:DHE-RSA-CAMELLIA256-SHA:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES128-SHA256:AES128-GCM-SHA256:DHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA:!NULL:!eNULL:!aNULL:!DSS:-LOW:RSA+RC4+SHA https://www.gbservers.co.uk/2014/10/19/centos-5-tls-1-2-support-cpanelwhm/
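
    Once the rebuild is done, it is worth confirming that the newer protocol is actually negotiated. A rough check, assuming openssl was installed under the default /usr/local/ssl prefix used above and yourdomain.com is a site served by this Apache:

      /usr/local/ssl/bin/openssl version
      # should report the protocol as TLSv1.2 and a strong cipher
      echo | /usr/local/ssl/bin/openssl s_client -connect yourdomain.com:443 -tls1_2 2>/dev/null | grep -E 'Protocol|Cipher'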
  9. edit /etc/yum.repos.d/CentOS-Base.repo and replace with # CentOS-Base.repo # # The mirror system uses the connecting IP address of the client and the # update status of each mirror to pick mirrors that are updated to and # geographically close to the client. You should use this for CentOS updates # unless you are manually picking other mirrors. # # If the mirrorlist= does not work for you, as a fall back you can try the # remarked out baseurl= line instead. # # [base] name=CentOS-$releasever - Base mirrorlist=http://mirrorlist.centos.org/?release=$releasever&arch=$basearch&repo=os baseurl=http://vault.centos.org/5.11/os/x86_64/ #baseurl=http://mirror.centos.org/centos/$releasever/os/$basearch/ gpgcheck=1 gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-5 #released updates [updates] name=CentOS-$releasever - Updates mirrorlist=http://mirrorlist.centos.org/?release=$releasever&arch=$basearch&repo=updates baseurl=http://vault.centos.org/5.11/updates/x86_64/ #baseurl=http://mirror.centos.org/centos/$releasever/updates/$basearch/ gpgcheck=1 gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-5 #additional packages that may be useful [extras] name=CentOS-$releasever - Extras mirrorlist=http://mirrorlist.centos.org/?release=$releasever&arch=$basearch&repo=extras baseurl=http://vault.centos.org/5.11/extras/x86_64/ #baseurl=http://mirror.centos.org/centos/$releasever/extras/$basearch/ gpgcheck=1 gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-Cent https://www.centos.org/forums/viewtopic.php?t=62130
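
    After saving the file, clear the cached metadata and confirm the vault repositories resolve (run as root):

      yum clean all
      yum repolist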
  10. https://www.webpagefx.com/tools/emoji-cheat-sheet/
  11. Hello Donna Carol,

    Welcome to designhost.gr.

    Feel free to browse our community accessing all sorts of information and getting to know our members.

    Do not hesitate to ask anything in our forums.

    designhost.gr

  12. sed -i 's/utf8mb4_unicode_520_ci/utf8_general_ci/' {SQL FILE}
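
    This is handy when a dump taken from a newer MySQL/MariaDB uses the utf8mb4_unicode_520_ci collation and has to be imported into an older server that does not support it. A hedged example with placeholder names (backup.sql, dbname); the -i.bak variant keeps a copy of the original dump, and /g replaces every occurrence on each line:

      sed -i.bak 's/utf8mb4_unicode_520_ci/utf8_general_ci/g' backup.sql
      mysql -u root -p dbname < backup.sql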
  13. all done Hi Nick Partsalas, Good news! Version 4.3.4 of Invision Community is now available. Added a filter to view members that have opt-in for bulk mail in the ACP, and an option to opt-out those members Bug fixes Also included: 4.3.3 New features for GDPR compliance: New feature for administrators to download an XML file of all personal information held. New setting to automatically prune IP address records. New option when deleting a member to anonymize content submitted by them. New setting to automatically add links to privacy policies of integrated third party services such as Google Analytics or Facebook Pixel to your privacy policy if they are enabled. Fixes an issue where Calendar events submitted in different timezones to the user may show at the wrong time. Other minor bug fixes and improvements. Learn more about GDPR compliance features in this release
  14. What's new in version 4.3.4 Added a filter to view members that have opt-in for bulk mail in the ACP, and an option to opt-out those members Bug fixes Also included: 4.3.3 New features for GDPR compliance: New feature for administrators to download an XML file of all personal information held. New setting to automatically prune IP address records. New option when deleting a member to anonymize content submitted by them. New setting to automatically add links to privacy policies of integrated third party services such as Google Analytics or Facebook Pixel to your privacy policy if they are enabled. Fixes an issue where Calendar events submitted in different timezones to the user may show at the wrong time. Other minor bug fixes and improvements. Learn more about GDPR compliance features in this release
  15. This is a quick guide on how to install both the Redis PHP extension as well as the daemon via SSH. Installing the Redis daemon: for CentOS 6/RHEL 6 rpm -ivh https://dl.fedoraproject.org/pub/epel/epel-release-latest-6.noarch.rpm rpm -ivh http://rpms.famillecollet.com/enterprise/remi-release-6.rpm yum -y install redis --enablerepo=remi --disableplugin=priorities chkconfig redis on service redis start for CentOS 7/RHEL 7 rpm -ivh https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm rpm -ivh http://rpms.famillecollet.com/enterprise/remi-release-7.rpm yum -y install redis --enablerepo=remi --disableplugin=priorities systemctl enable redis systemctl start redis Installing the Redis PHP extension for all available versions of PHP. Copy and paste the entire block into SSH, don't do line by line. for phpver in $(ls -1 /opt/cpanel/ |grep ea-php | sed 's/ea-php//g') ; do cd ~ wget -O redis.tgz https://pecl.php.net/get/redis tar -xvf redis.tgz cd ~/redis* || exit /opt/cpanel/ea-php"$phpver"/root/usr/bin/phpize ./configure --with-php-config=/opt/cpanel/ea-php"$phpver"/root/usr/bin/php-config make clean && make install echo 'extension=redis.so' > /opt/cpanel/ea-php"$phpver"/root/etc/php.d/redis.ini rm -rf ~/redis* done /scripts/restartsrv_httpd /scripts/restartsrv_apache_php_fpm All done! Check to make sure the PHP extension is loaded in each version of PHP: Copy and paste the entire block into SSH, don't do line by line. for phpver in $(ls -1 /opt/cpanel/ |grep php | sed 's/ea-php//g') ; do echo "PHP $phpver" ; /opt/cpanel/ea-php$phpver/root/usr/bin/php -i |grep "Redis Support" done Output should be: PHP 55 Redis Support => enabled PHP 56 Redis Support => enabled PHP 70 Redis Support => enabled PHP 71 Redis Support => enabled Enjoy! https://help.bigscoots.com/cpanel/cpanel-easyapache-4-installing-redis-and-redis-php-extension
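
    A couple of quick sanity checks once the daemon is running (assuming redis-cli is on the PATH and Redis listens on the default port):

      redis-cli ping                               # should answer PONG
      redis-cli info server | grep redis_version   # confirm the installed version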
  16. The first time I do sudo on openSUSE I'm always warned with a someway fancy message We trust you have received the usual lecture from the local System Administrator. It usually boils down to these three things: #1) Respect the privacy of others. #2) Think before you type. #3) With great power comes great responsibility. root's password: After the first successful login I won't be warned again. I'd like to be always warned. I find this message someway fancy. Is there any way to be warned like that by sudo prompt? https://superuser.com/questions/500119/keeping-the-fancy-sudo-warning-forever >> Create a file inside /etc/sudoers.d/ You can use this command sudo nano /etc/sudoers.d/privacy Now paste this line into the file. Defaults lecture = always Now close Terminal/Konsole, Reopen it and try to do something with sudo.
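
    If you prefer to have the syntax validated before the file is written, you can edit it through visudo instead of nano (a sketch; the file name "privacy" is arbitrary):

      sudo visudo -f /etc/sudoers.d/privacy
      # then add the single line:
      # Defaults lecture = always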
  17. Linux command shelf is a quick reference guide that introduce most set of commands used in linux, broadly based on rhel / centos. Commands are divided into 15 sections with description for each commands. The pdf format of all these commands are available for download. Let us know if you have any comments or corrections. You can download the latest version 1.1 of linux command shelf in pdf format. This guide can be used by newbies and also advanced users for reference. You can navigate to each section using the index that is placed below. If you feel hard to understand any commands please let us know. System Hardware Statistics Users File Commands Process Related File Permission Related Network Compression / Archives Install Package Search Login (ssh and telnet) File transfer Disk Usage Directory Traverse 1. SYSTEM $ uname -a => Display linux system information $ uname -r => Display kernel release information (refer uname command in detail) $ cat /etc/redhat_release => Show which version of redhat installed $ uptime => Show how long system running + load (learn uptime command) $ hostname => Show system host name $ hostname -i => Display the IP address of the host (all options hostname) $ last reboot => Show system reboot history (more examples last command) $ date => Show the current date and time (options of date command) $ cal => Show this month calendar (what more in cal) $ w => Display who is online (learn more about w command) $ whoami => Who you are logged in as (example + sreenshots) $ finger user => Display information about user (many options of finger command) 2. Hardware $ dmesg => Detected hardware and boot messages (dmesg many more options) $ cat /proc/cpuinfo => CPU model $ cat /proc/meminfo => Hardware memory $ cat /proc/interrupts => Lists the number of interrupts per CPU per I/O device $ lshw => Displays information on hardware configuration of the system $ lsblk => Displays block device related information in Linux (sudo yum install util-linux-ng) $ free -m => Used and free memory (-m for MB) (free command in detail) $ lspci -tv => Show PCI devices (very useful to find vendor ids) $ lsusb -tv => Show USB devices (read more lsusb options) $ lshal => Show a list of all devices with their properties $ dmidecode => Show hardware info from the BIOS (vendor details) $ hdparm -i /dev/sda # Show info about disk sda $ hdparm -tT /dev/sda # Do a read speed test on disk sda $ badblocks -s /dev/sda # Test for unreadable blocks on disk sda 3. Statistics $ top => Display and update the top cpu processes (30 example options) $ mpstat 1 => Display processors related statistics (learn mpstat command) $ vmstat 2 => Display virtual memory statistics (very useful performance tool) $ iostat 2 => Display I/O statistics (2sec Intervals) (more examples) $ tail -n 500 /var/log/messages => Last 10 kernel/syslog messages (everyday use tail options) $ tcpdump -i eth1 => Capture all packets flows on interface eth1 (useful to sort network issue) $ tcpdump -i eth0 'port 80' => Monitor all traffic on port 80 ( HTTP ) $ lsof => List all open files belonging to all active processes.(sysadmin favorite command) $ lsof -u testuser => List files opened by specific user $ free -m => Show amount of RAM (daily usage command) $ watch df -h => Watch changeable data continuously(interesting linux command) 4. 
Users $ id => Show the active user id with login and group(with screenshot) $ last => Show last logins on the system (few more examples) $ who => Show who is logged on the system(real user who logged in) $ groupadd admin => Add group "admin" (force add existing group) $ useradd -c "Sam Tomshi" -g admin -m sam => Create user "sam" and add to group "admin"(here read all parameter) $ userdel sam => Delete user sam (force,file removal) $ adduser sam => Add user "sam" $ usermod => Modify user information(mostly useful for linux system admins) 5. File Commands $ ls -al => Display all information about files/ directories(20 examples) $ pwd => Show current directory path(simple but need every day) $ mkdir directory-name => Create a directory(create mutiple directory) $ rm file-name => Delete file(be careful of using rm command) $ rm -r directory-name => Delete directory recursively $ rm -f file-name => Forcefully remove file $ rm -rf directory-name => Forcefully remove directory recursively $ cp file1 file2 => Copy file1 to file2 (15 cd command examples) $ cp -r dir1 dir2 => Copy dir1 to dir2, create dir2 if it doesn't exist $ mv file1 file2 => Move files from one place to another(with 10 examples) $ ln -s /path/to/file-name link-name => Create symbolic link to file-name (examples) $ touch file => Create or update file (timestamp change) $ cat > file => Place standard input into file (15 cat command examples) $ more file => Output the contents of file (help display long tail files) $ head file => Output the first 10 lines of file (with different parameters) $ tail file => Output the last 10 lines of file (detailed article with tail options) $ tail -f file => Output the contents of file as it grows starting with the last 10 lines $ gpg -c file => Encrypt file (how to use gpg) $ gpg file.gpg => Decrypt file 6. Process Related $ ps # Display your currently active processes (many parameters to learn) $ ps aux | grep 'telnet' # Find all process id related to telnet process $ pmap # Memory map of process (kernel,user memory etc) $ top # Display all running processes (30 examples) $ kill pid # Kill process with mentioned pid id (types of signals) $ killall proc # Kill all processes named proc $ pkill processname # Send signal to a process with its name $ bg # Resumes suspended jobs without bringing them to foreground (bg and fg command) $ fg # Brings the most recent job to foreground $ fg n # Brings job n to the foreground 7. File Permission Related $ chmod octal file-name # Change the permissions of file to octal , which can be found separately for user, group and world octal value (more examples) 4 - read 2 - write 1 - execute Example $ chmod 777 /data/test.c # Set rwx permission for owner , rwx permission for group, rwx permission for world $ chmod 755 /data/test.c # Set rwx permission for owner,rx for group and world $ chown owner-user file # Change owner of the file (chown more examples) $ chown owner-user:owner-group file-name # Change owner and group owner of the file $ chown owner-user:owner-group directory # Change owner and group owner of the directory Example $ chown bobbin:linoxide test.txt $ ls -l test.txt -rw-r--r-- 1 bobbin linoxide 0 Mar 04 08:56 test.txt 8. 
Network $ ifconfig -a # Display all network ports and ip address (set mtu and other all options,ifconfig now in deprecated network command) $ ifconfig eth0 # Display specific ethernet port ip address and details $ ip addr show # Display all network interfaces and ip address(available in iproute2 package,powerful than ifconfig) $ ip address add 192.168.0.1 dev eth0 # Set ip address $ ethtool eth0 # Linux tool to show ethernet status (set full duplex , pause parameter) $ mii-tool eth0 # Linux tool to show ethernet status (more or like ethtool) $ ping host # Send echo request to test connection (learn sing enhanced ping tool) $ whois domain # Get who is information for domain $ dig domain # Get DNS information for domain (screenshots with other available parameters) $ dig -x host # Reverse lookup host $ host google.com # Lookup DNS ip address for the name (8 examples of host command) $ hostname -i # Lookup local ip address (set hostname too) $ wget file # Download file (very useful other option) $ netstat -tupl # Listing all active listening ports(tcp,udp,pid) (13 examples) 9. Compression / Archives $ tar cf home.tar home # Create tar named home.tar containing home/ (11 tar examples) $ tar xf file.tar # Extract the files from file.tar $ tar czf file.tar.gz files # Create a tar with gzip compression $ gzip file # Compress file and renames it to file.gz (untar gzip file) 10. Install Package $ rpm -i pkgname.rpm # Install rpm based package (Installing, Uninstalling, Updating, Querying ,Verifying) $ rpm -e pkgname # Remove package Install from source ./configure make make install (what it is) 11. Search $ grep pattern files # Search for pattern in files (you will this command often) $ grep -r pattern dir # Search recursively for pattern in dir $ locate file # Find all instances of file $ find /home/tom -name 'index*' # Find files names that start with "index"(10 find examples) $ find /home -size +10000k # Find files larger than 10000k in /home 12. Login (ssh and telnet) $ ssh user@host # Connect to host as user (secure data communication command) $ ssh -p port user@host # Connect to host using specific port $ telnet host # Connect to the system using telnet port 13. File Transfer scp $ scp file.txt server2:/tmp # Secure copy file.txt to remote host /tmp folder $ scp nixsavy@server2:/www/*.html /www/tmp # Copy *.html files from remote host to current system /www/tmp folder $ scp -r nixsavy@server2:/www /www/tmp # Copy all files and folders recursively from remote server to the current system /www/tmp folder rsync $ rsync -a /home/apps /backup/ # Synchronize source to destination $ rsync -avz /home/apps linoxide@192.168.10.1:/backup # Synchronize files/directories between the local and remote system with compression enabled 14. Disk Usage $ df -h # Show free space on mounted filesystems(commonly used command) $ df -i # Show free inodes on mounted filesystems $ fdisk -l # Show disks partitions sizes and types(fdisk command output) $ du -ah # Display disk usage in human readable form (command variations) $ du -sh # Display total disk usage on the current directory $ findmnt # Displays target mount point for all filesystem (refer type,list,evaluate output) $ mount device-path mount-point # Mount a device 15. Directory Traverse $ cd .. # To go up one level of the directory tree(simple & most needed) $ cd # Go to $HOME directory $ cd /test # Change to /test directory
  18. http://www.nickpar.dyndns.org/uploads/files/1/Important/linux-cheat-sheet.pdf https://linoxide.com/guide/linux-command-shelf.html
  19. We’d like to share some exciting news about the future of SolusVM. After many successful years as part of the OnApp family of companies, the next chapter has begun for us. Effective June 06th, 2018 Plesk is taking over the assets of SolusVM for an undisclosed amount. With this acquisition, SolusVM will become part of the Plesk product portfolio and development will continue within the agile and highly skilled Plesk R&D team. Plesk is fully committed to both SolusVM partners and customers. Our common goal is to unite our strengths and drive the future growth of the SolusVM suite of tools for your benefit. Together, we will deliver innovative solutions that add significant value for both our partners and end customers. Founded in 1999, Plesk is a leading global WebOps and hosting software company. As of now, Plesk runs on over 380K server installations, hosts +11 million websites and +19 million mailboxes. It’s available in 32 languages, across 140 countries, with over 50% of the top 100 hosting and service providers worldwide partnering with Plesk today. For our valued partners and customers, there will only be one immediate change. As a direct result of the acquisition, all existing SolusVM partnership agreements and customer contracts will be directly assumed by Plesk. SolusVM business continues as usual, and your existing SolusVM accounts and support contacts will remain the same. We would like to sincerely thank you for your business, and your trust in us. We’re looking forward to continuing our partnership and providing you with innovative solutions and services long into the future. We’ve put together a short FAQ that you might find helpful - you can find it at: https://solusvm.com/plesk-faq. In the meantime, if you have any questions about this news, please don’t hesitate to contact us via our support portal. Best Regards, Phill Bandelow and the SolusVM team. visit our website | log in to your account | get support Copyright © SolusVM, All rights reserved.
  20. Are you a member of a busy Facebook Group? Do you find it overwhelming trying to sort through all the posts to find something posted the day before? Are you now missing new posts and only seeing them a few days later? Facebook Groups are tempting to use as they are free to set up, but is this the best decision for the future of your business? At the beginning, with just a handful of members, things may run fine. But fast forward to where your group becomes busy with thousands of members posting and reading. Your group becomes overwhelming. You find it hard to locate posts made on previous days and search is of no use. It is getting harder to keep on top of troublesome and spamming members. Worse still, Facebook's changing algorithms mean that your members are not seeing every post you make. You do as Facebook asks and link your page to your group, only to find that you must now boost posts to reach your members. This is getting to be a very common scenario. Even more worrying are rumours that Facebook is bringing advertising to groups. Will this allow your competitors to target your hard-won membership? Will Facebook roll out the "Discover" tab across all continents? This alone has destroyed organic reach for many brands. What would you do if Facebook blocked your account for a week? Would your sales suffer? There is a way to take back control of your membership and secure your business' future. Building your business on your own land is a powerful way of retaining complete control over your community regardless of what happens to Facebook longer term. Created in 2002, Invision Community has always adapted to the changing habits of the internet. Our latest product is clean, modern, mobile ready and equipped to integrate with social media. It can power your conversations, website and shopping cart. It features single click Facebook sign in and tools to promote scheduled content to your Facebook page. We recently wrote why you shouldn't settle for a Facebook Group when building a community. The benefits of an owned Invision Community are: You own your own data, and your data is not mined for Facebook's benefit. Make it yours by branding it your way. You're no longer boxed in by the Facebook format. Seamless integration with your shopping cart for more monetization opportunities. Set up permission levels to better control what your members can see. Let's dig in and look at some of the tools you can leverage to make the migration easier. Mobile Ready Invision Community works great on your mobile. It resizes the page perfectly to match whichever device you are using. You don't need to install special apps or mess with themes. It just works out of the box. Facebook Sign In The first thing you'll want to do is turn on Facebook Sign In. This adds the familiar Facebook button right on the sign in page and register form. Clicking this logs them into your new community with their Facebook account. It even imports their profile photo so they are familiar with other members. Make use of embeds A great way to keep incorporating content from your Facebook Group or Page is to use embeds. Post a link to your content on Facebook and it transforms into a rich media snippet. Social Promotions Share your community content with your Facebook Page. Click the "Promote" button on any content item and you can customize the text and images shared. The promotion system offers a full scheduling system much like Buffer or Hootsuite. This is all built in at no extra cost.
Find Your Content Unlike a Facebook Group, your Invision Community makes it easy to find older content. A powerful feature is activity streams. These are customizable "feeds", much like the Facebook News Feed, but completely editable to suit you and your members' needs. You can even make this the first page your members see for easy content discovery. Use Clubs Clubs allow sub-communities to run inside your main community. Let's look at a real world example. A FitPro has several different fitness products for sale. Each product is a Facebook Group. She posts daily workouts and answers members' questions. Managing many separate groups can be very time consuming. Clubs put these sub-communities right on the page, making it easy to drop in and update. These Clubs can be private, with members invited to join, allowing full privacy. This is like a closed Facebook group. We're only scratching the surface of what Invision Community can offer you. You can take back control of your membership and be free from the fear that Facebook will change something that will impact your sales. We're experts in this field with 16 years of experience. We've helped grow thousands of communities from the very biggest brands to the smallest of niches. We'd love to talk to you about your needs. https://invisioncommunity.com/news/community-management/why-owning-your-own-community-is-better-than-using-a-facebook-group-r1047/
  21. " Is it not possible to use wildcards in the Exchange 2007 Message Tracking Assistant? My boss asked me to look for all messages from a particular vendor, and so I put her email address in Recipients, then put *@godaddy.com in the Sender field, and clicked Next. Is this not possible? It seems like a functionality you'd expect to be there - am I just formatting something wrong? " - Nope, both the sender and recipient fields can only be complete smtp addresses, it is not possible to search using wildcards. If you want to do that you will need to search the log files manually using something like notepad++ (which can search multiple files at once which should help!). http://technet.microsoft.com/en-us/library/bb124926%28EXCHG.80%29.aspx - From the Exchange Management Shell, run the following command: Powershell Get-MessageTrackingLog -EventId "RECEIVE" -Start "5/1/2015 12:00:00 AM" -End "5/31/2015 11:59:59 PM" -ResultSize unlimited | where {$_.Sender -like "*@godaddy.com"} This will search everyone's mailbox on your Exchange server for all emails from anyone at GoDaddy.com between May 1, 2015 and May 31, 2015 and display them. You can add specific recipients to your command by specifying -Recipients:username@domain.com anywhere before the pipe ("|"). https://community.spiceworks.com/topic/124477-wildcards-in-exchange-2007-message-tracking
  22. http://serverfault.com/questions/194654/how-do-i-find-new-active-directory-accounts-that-have-been-made-in-the-last-90-d http://www.winvistatips.com/list-user-account-creation-date-t754540.html It says: On the server, open a CMD.EXE prompt and type: dsquery * domainroot -filter "(&(objectCategory=Person)(objectClass=User))" -attr distinguishedName sAMAccountName whenCreated -Limit 0 When I did this, I got a list of my AD users with creation date. https://community.spiceworks.com/topic/334893-how-to-check-active-directory-user-account-created-date
  23. from IPS blog This month at Invision Community It's safe to come out. This email isn't about GDPR (well, not really). If you're reading this, then well done. You survived the great GDPR email deluge of 2018 and you're still on our list. And we're thrilled to have you. Right, let's get onto the fun stuff! Invision Community 4.3.3 was released to incorporate some GDPR features to make your compliance a little easier. We also snuck in a fair few bug fixes while we were at it. In this month's team talk, our team share photos and details of their workstations. In our community management series, we answer your GDPR questions, and look at how to successfully migrate your community from another platform. Finally, we take a look at some highlights from the Marketplace. As always, we'd love to hear what you think. Get in touch via our forums, LinkedIn, Facebook or Twitter, or email us at feedback@invisionpower.com. - Matt at Invision Community Latest Release 4.3.3 May 25th Release Notes Latest Product News Invision Community 4.3.3 has been released to add in a few more GDPR features, such as personal data export and a tool to prune IP addresses from older posted content. It's now available from your client center. Company Management How to successfully convert your platform and breathe new life into your community Do you have a community but are looking to move to a more modern and feature rich platform? There's a lot of ways Invision Community can breathe new life into your community. With our engagement features, advanced promotion features and mobile ready responsive themes, your members will love the changes. Read Article Company Management Your GDPR questions answered You've no doubt heard about GDPR by now. It's a very hot topic in many circles. Lots of experts are weighing in on the best approach to take before the May 25th deadline. I wrote about how Invision Community can help with your GDPR compliance back in December. I've seen a lot of posts and topics on GDPR in our community since then. Read Article Community Management How to stop spam in your community We all know what a pain spam can be. We deal with it daily in our inboxes often relying on clever software to filter it out for us. Even worse, some spam is so well disguised that it can fool you into thinking it is a genuine message. You've put in the hard work with your community. You've used the built in social promotion tools and SEO features to get it noticed. Now that it's indexing well with Google, you've become a target. Read Article Team Talk Show us your workstation You have probably spoken to us in support tickets and on our community forums, and you've likely seen our photos. But what about our workstations? What do they reveal about our personalities? Read Article
  24. Good news! Version 4.3.3 of Invision Community is now available. New features for GDPR compliance: New feature for administrators to download an XML file of all personal information held. New setting to automatically prune IP address records. New option when deleting a member to anonymize content submitted by them. New setting to automatically add links to privacy policies of integrated third party services such as Google Analytics or Facebook Pixel to your privacy policy if they are enabled. Fixes an issue where Calendar events submitted in different timezones to the user may show at the wrong time. Other minor bug fixes and improvements. Learn more about GDPR compliance features in this release Also included: 4.3.2 Version 4.3.2 is a small maintenance update to fix issues reported since 4.3.1, including: Promotes non-functional when "Our Picks" disabled. Various emoji fixes, including skintones and mobile issues. Online stats. Numerous IE11 fixes. PayPal billing agreements failing due to lack of address.