tag:blogger.com,1999:blog-36577668809213698082024-03-14T11:57:41.693+05:30Jugnu Life :-)ਜੁਗਨੂ ਦੀ ਜਿੰਦਗੀ ...
Spread the light....Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comBlogger405125tag:blogger.com,1999:blog-3657766880921369808.post-16001374975513030832020-06-10T03:27:00.000+05:302020-06-10T03:27:17.615+05:30Maven package org.apache.commons.httpclient.methods does not exist<div>Apache commons http client has two versions</div><div><br /></div><div>https://mvnrepository.com/artifact/org.apache.httpcomponents/httpclient</div><div><br /></div><div>https://mvnrepository.com/artifact/commons-httpclient/commons-httpclient</div><div><br /></div><div>From</div><div><br /></div><div>https://stackoverflow.com/questions/10986661/apache-httpclient-does-not-exist-error</div><div><br /></div><div><div class="post-text" itemprop="text">Solution:</div><div class="post-text" itemprop="text"><br /></div><div class="post-text" itemprop="text">Use older version of the client</div><div class="post-text" itemprop="text"><br /></div><div class="post-text" itemprop="text"><pre style="background-color: #2b2b2b; color: #a9b7c6; font-family: 'JetBrains Mono',monospace; font-size: 9.8pt;"><span style="color: #e8bf6a;"><dependency><br /></span><span style="color: #e8bf6a;"> <groupId></span>commons-httpclient<span style="color: #e8bf6a;"></groupId><br /></span><span style="color: #e8bf6a;"> <artifactId></span>commons-httpclient<span style="color: #e8bf6a;"></artifactId><br /></span><span style="color: #e8bf6a;"> <version></span>3.1<span style="color: #e8bf6a;"></version><br /></span><span style="color: #e8bf6a;"></dependency></span></pre></div></div><div><br /></div><div><br /></div><div><br /></div>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-58078410021302064362020-06-08T09:52:00.000+05:302020-06-08T09:52:03.934+05:30Ubuntu 18.04 customizations<div dir="ltr" style="text-align: left;" trbidi="on">
Ubuntu comes with lots of good options to configure the system.<br />
<br />
A few of the things I like are mentioned below.<br />
<br />
Enable GNOME Shell extensions and Windows-like themes<br />
<br />
https://www.howtogeek.com/353819/how-to-make-ubuntu-look-more-like-windows/ <br />
<br />
<br />
<pre>sudo apt install gnome-shell-extensions gnome-shell-extension-dash-to-panel </pre>
<pre>sudo apt install gnome-tweaks adwaita-icon-theme-full</pre>
<div style="text-align: left;">
Install a few good extensions</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
https://itsfoss.com/things-to-do-after-installing-ubuntu-18-04/</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
To use GNOME Shell extensions, enable the browser extension and also the host connector.</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
Once you do that, you will see the toggle button for any GNOME extension right inside the browser window.</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
You can also configure that in the Tweaks application.
</div>
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-25367541034296683832020-06-05T02:11:00.001+05:302020-06-05T02:11:27.293+05:30Gradle Could not create service of type ScriptPluginFactory<div>Error</div><div><br /></div><div>Could not create service of type ScriptPluginFactory using BuildScopeServices.createScriptPluginFactory().<br /></div><div><br /></div><div><br /></div><div>Detailed exception</div><div><br /></div><div>[jj@184fc3b978cc bigtop]$ ./gradlew clean<br /><br />FAILURE: Build failed with an exception.<br /><br />* What went wrong:<br />Could not create service of type ScriptPluginFactory using BuildScopeServices.createScriptPluginFactory().<br />> Could not create service of type CrossBuildFileHashCache using BuildSessionScopeServices.createCrossBuildFileHashCache().<br /><br />* Try:<br />Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.<br /><br />* Get more help at https://help.gradle.org<br /><br />BUILD FAILED in 0s<br /></div><div><br /></div><div>Solution</div><div><br /></div><div>The directory containing the code was owned by root:root.</div><div><br /></div><div>Change the ownership back to your user and it should work.<br /></div>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-92174363458482863282020-06-01T09:02:00.003+05:302020-06-01T09:19:41.227+05:30Clevo P570WM Ubuntu freeze problem<div dir="ltr" style="text-align: left;" trbidi="on">
TL;DR<br />
<br />
To get Ubuntu running on a Clevo P570WM, use kernel version 5.6.15 or newer. Here are the steps to <a href="https://www.tecmint.com/upgrade-kernel-in-ubuntu/" target="_blank">upgrade</a> the kernel.<br />
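To check whether a running kernel already meets that minimum, a version-aware comparison with `sort -V` works; this is a small sketch, with `5.6.15` being the threshold from the TL;DR:

```shell
required="5.6.15"
current="$(uname -r)"
# sort -V orders version strings numerically; if the required version
# sorts first (or ties), the running kernel is at least that new.
oldest="$(printf '%s\n' "$required" "$current" | sort -V | head -n1)"
if [ "$oldest" = "$required" ]; then
    echo "kernel $current meets the $required minimum"
else
    echo "kernel $current is older than $required - upgrade needed"
fi
```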
<br />
Seven years back, in 2013, I bought a P570WM laptop from <a href="https://www.metabox.com.au/" target="_blank">Metabox</a>. I made one big mistake: I invested money in something that was brand new in the market. My goal was to get a good Ubuntu laptop, and I decided to go with a Clevo-based machine.<br />
<br />
Below are the specs. They might not look impressive now, but considering they are 7 years old, they were really good back then.<br />
<br />
<ul style="text-align: left;">
<li><span style="background-color: #fefefe; font-family: "verdana" , sans-serif; font-size: 10.6667px;">Screen type: 17.3" FHD 1920x1080 LED/LCD </span></li>
<li><span style="background-color: #fefefe; font-family: "verdana" , sans-serif; font-size: 10.6667px;">Graphics: Nvidia GTX 780M 4GB GDDR5 video graphics</span></li>
<li><span style="background-color: #fefefe; font-family: "verdana" , sans-serif; font-size: 10.6667px;">Processor: i7-3970X 6-Core 3.5GHz - 4.0GHz 15MB Cache</span></li>
<li><span style="background-color: #fefefe; font-family: "verdana" , sans-serif; font-size: 10.6667px;">RAM memory: 32GB DDR3 1600Mhz RAM </span></li>
<li><span style="background-color: #fefefe; font-family: "verdana" , sans-serif; font-size: 10.6667px;">Primary drive: 1TB 7200 rpm primary hard drive</span></li>
</ul>
<br />
I was very happy when the laptop arrived, but my nightmare began when I installed Ubuntu on it and it froze immediately on boot.<br />
<br />
Unfortunately, Metabox was of no help; they said they don't support Ubuntu, and I was left alone with a $5.5K machine of no use, a massive waste of money.<br />
<br />
Fast forward to 2020: the laptop was collecting dust in my cupboard, so I decided to give it another shot, and I am glad it worked. I am using Ubuntu 18.04 with kernel 5.6.15 and no custom drivers for the Nvidia card.<br />
<br />
The posts that gave me hope to keep going are mentioned below.<br />
<br />
<a href="https://edgcert.com/2019/06/03/ubuntu-on-clevo/">https://edgcert.com/2019/06/03/ubuntu-on-clevo/</a><br />
<a href="https://forum.manjaro.org/t/freezes-when-probing-system-on-clevo-n850hk1/60346">https://forum.manjaro.org/t/freezes-when-probing-system-on-clevo-n850hk1/60346</a><br />
<a href="https://askubuntu.com/questions/1068161/clevo-n850el-crashes-freezes-ubuntu-18-04-1-frequently">https://askubuntu.com/questions/1068161/clevo-n850el-crashes-freezes-ubuntu-18-04-1-frequently</a><br />
<br />
Related Kernel bug<br />
<a href="https://bugzilla.kernel.org/show_bug.cgi?id=109051">https://bugzilla.kernel.org/show_bug.cgi?id=109051</a><br />
<br />
<br /></div>
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-42799427979817239942020-05-18T23:31:00.000+05:302020-05-19T23:31:47.331+05:30Upgrading Large Hadoop Cluster<div dir="ltr" style="text-align: left;" trbidi="on">
<div>
A long time back, I wrote a post about migrating a large Hadoop cluster, in which I shared my experience of how we did a migration between two Hadoop environments. Last weekend, we did another similar activity, which I thought I would document and share.</div>
<div>
<br /></div>
<h2 style="text-align: left;">
Prologue</h2>
<div>
We are a big telco with many Hadoop environments, and this post is the upgrade story of one of our clusters.</div>
<div>
<br /></div>
<h2 style="text-align: left;">
Many weeks before The Weekend</h2>
<div>
<br /></div>
<div>
For many days, we worked on upgrading one of our main Hadoop platforms. Since it was a major upgrade of the HDP stack from version 2.6.4 to 3.1.5, it needed a lot of planning and testing. There were many things that helped us face D-day with confidence, which I want to share.</div>
<div>
<br /></div>
<h3 style="text-align: left;">
Practice upgrades</h3>
<div>
We did 3 practice upgrades in our development environment to ensure we knew exactly how each and every step would work and what kinds of issues we could face. A comprehensive knowledge base of all known errors and solutions was built from this exercise. This document was shared with all team members involved in the upgrade activity, so that someone would remember it when we hit an issue during the real upgrade. It becomes extremely challenging when you hit an issue and the clock is ticking to bring the cluster back up for the Monday workload. We also did one full-team practice run, so that all team members knew the steps and sequences involved before getting into the real one and everyone got a feel for it.</div>
<div>
<br /></div>
<h3 style="text-align: left;">
Code changes</h3>
<div>
We made all the code changes required to ensure our existing applications could run comfortably on the new platform stack. Testing was done in a development environment stood up with the new stack versions. We opened it to all application teams and use cases to test their workloads.</div>
<div>
<br /></div>
<h3 style="text-align: left;">
Meeting the pre-requisites for upgrade</h3>
<div>
One of the challenges we have is the massive amount of data. Being a telco, our network feeds can fill a cluster very quickly. We had to keep strong control over what data comes in and what queries users run to keep the total used cluster storage under 85%; a single bad user query can fill a cluster within hours. Our cluster is a decent size, around 1.8 PB, so moving data to another environment when we are overusing HDFS is also a normal flow for us.</div>
<div>
<br /></div>
<div>
<br /></div>
<h2 style="text-align: left;">
One week before The weekend</h2>
<h3 style="text-align: left;">
Imaginary upgrade</h3>
<div>
We did an exercise in which we brainstormed a fictitious upgrade and tried to get into the mindset of the steps and sequences we would follow during an upgrade. We listed every minor thing that came to mind, right from raising the change request to closing it after the completion of the upgrade. This imaginary exercise brought to our attention many things that had not been planned earlier and allowed us to get our ducks in a row, with a precise order of steps to be executed.</div>
<div>
<br /></div>
<h3 style="text-align: left;">
Applications upgrade and use case teams</h3>
<div>
In a large shared cluster environment, finding all the job dependencies and applications that are impacted is a challenge. We started sending bulk communications to all users of the platform one month in advance of the planned upgrade, so that we would eventually get the attention of all users and applications that run on top of the platform and remind them about the upcoming downtime for the system.</div>
<div>
<br /></div>
<h3 style="text-align: left;">
Data feeds redirection</h3>
<div>
Many data feeds inside the telco space are very big. We get the opportunity to capture them only once, and if we don’t, we lose that data. To prepare for the downtime, we planned to redirect these feeds to an alternative platform, with a view to bringing them back to the main cluster post-upgrade. This exercise needs attention and proper impact analysis to determine whether feeds could be lost permanently or could be grabbed again from the source later.</div>
<div>
<br /></div>
<h3 style="text-align: left;">
The time roster</h3>
<div>
A few days before the upgrade, we made a timeline view of the upgrade weekend. The goal was to be able to bring people in and out during the weekend, giving them rest as required. We divided into groups: people who come in before the upgrade to redirect and stop data feeds, people who do the upgrade, and people who come in post-upgrade to resume jobs and stop the data feed redirection. Besides these groups, we also had a group of people acting as beta testers, testing all the user-experience items over the weekend. This group structure gave everyone a clear idea of when they would enter the scene and what was expected of them.</div>
<div>
<br /></div>
<h2 style="text-align: left;">
The Weekend</h2>
<div>
<br /></div>
<h3 style="text-align: left;">
Friday</h3>
<div>
We divided the upgrade into 8 different stages, with the goal of doing the Ambari upgrade on Friday and getting through as much of the subsequent stages as possible the same day. The Ambari upgrade was easy; we did not hit any blockers and were done within our planned time.</div>
<div>
<br /></div>
<h3 style="text-align: left;">
Saturday and Sunday</h3>
<div>
Our original estimate for the HDP and HDF upgrade, based on my past experience of upgrades, was around 20 hours. But due to 3 technical issues we faced, our timeline got pushed out by 15 hours. Cloudera's on-call engineers were very responsive in assisting us with those problems. Hadoop is a massive beast; no single person can know all the things, so having access to SMEs from Cloudera when we needed them was a massive morale booster for us. It felt like we had someone to call if we needed to, and they did jump in to resolve all the blockers we hit. So, a massive thank you to the Cloudera team.</div>
<div>
<br /></div>
<h2 style="text-align: left;">
Credits</h2>
<div>
<br /></div>
<h3 style="text-align: left;">
Collaboration and COVID</h3>
<div>
This upgrade has been different for us. Due to COVID, like all companies worldwide, we have been working remotely for the past many weeks. It would not be fair to skip giving credit to Microsoft Teams for this: Teams has made it possible to work effectively since day 1 of the work-from-home environment. Our core team of 4 people involved in the upgrade was hooked into one Teams meeting session for 3 days. We used the screen-sharing and document-sharing features of Teams to make it easier to get the job done.</div>
<div>
<br /></div>
<h3 style="text-align: left;">
Kids and families</h3>
<div>
Lastly, it's worth mentioning the patience of our families, who brought all our meals next to the computer so that we could work, and who took care of the kids during the long working hours. With the Teams meeting broadcasting for many hours, we could hear each other's kids (except for one of us, who is a bachelor :) ) shouting, trying to grab attention, and wanting us to move away from the keyboard. With this upgrade over, we are now back to spending more and more time with them.</div>
<div>
<br /></div>
<h2 style="text-align: left;">
Weekend + 1 Monday</h2>
<div>
<br /></div>
<div>
The upgrade has been successful, and project teams and users are slowly coming back live on the platform. The users are reporting the issues they face, and we are incrementally fixing them. Data has started to flow back into the platform, with the flood gates of the massive feeds to be opened later during the week, and things are slowly getting back to normal. Our users are excited about all the new functionality this upgrade brings, and I am proud of what we have achieved.</div>
<div>
<br /></div>
<div>
Massive planning and practice exercises have delivered a good outcome for us. We missed planning for a few things, but we will learn from them; that is what life is, isn’t it?</div>
<div>
Until next upgrade, goodbye.</div>
<div>
<br /></div>
<div>
Thank you for reading. Please do leave a comment below.</div>
<div>
<br /></div>
</div>
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-43685900466264480852019-11-23T12:42:00.002+05:302019-11-23T12:42:24.065+05:30Replace ssh key of the AWS EC2 machine<div dir="ltr" style="text-align: left;" trbidi="on">
You can follow the steps below to change the SSH key for an AWS EC2 machine.<br />
<br />
Step 1)<br /><br />Check that your existing SSH key works and that you can log in to the machine with it. You can also log in directly via the new function in the AWS console.<br /><br />Step 2)<br /><br />Generate a new SSH key via the Amazon Web Console.<br /><br />Step 3)<br /><br />Extract the public key from it using the command below:<br /><br />ssh-keygen -y -f ~/Downloads/second.pem<br /><br />If you are working on a Windows system, use this: https://www.puttygen.com/convert-pem-to-ppk<br /><br /><br />Step 4)<br /><br />Log in to the machine and edit the file:<br /><br />vi ~/.ssh/authorized_keys<br /><br />Add the new public key and check that you are able to log in with the new key.<br /><br />Step 5) <br /><br />Change the permissions of the new key to 400 and try to log in.<br /><br />Step 6)<br /><br />If login is successful, delete the old key from the authorized_keys file.<br /><br />
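Step 3 can be rehearsed locally before touching the real instance. The sketch below generates a throwaway key pair and derives the public half from the private key in the same way; the file names here are examples, not the real AWS key:

```shell
# Create a throwaway RSA key pair with no passphrase (example path)
ssh-keygen -t rsa -b 2048 -N "" -f ./example-second -q

# Derive the public key from the private key, as in Step 3
ssh-keygen -y -f ./example-second > ./example-second.pub

# Restrict the private key's permissions, as in Step 5
chmod 400 ./example-second

cat ./example-second.pub
```

The derived `ssh-rsa ...` line is what gets appended to ~/.ssh/authorized_keys on the instance in Step 4.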
<br /></div>
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-60311439235572708842019-11-23T12:32:00.003+05:302019-11-23T12:58:20.533+05:30Tips for AWS Professional certification exams<p dir="ltr">
I recently took an AWS Professional exam and found certain things useful enough to document and share with all.
<br>This post is mostly about non-technical things. For technical pointers, please see the other related posts.
<br>AWS Professional exams not only test your AWS knowledge but also your mental and physical stamina. You have to sit there for 3 hours staring at the screen, reading very long questions and answers.
<br>Below are the things that can be useful for you.
<br><strong>Stretch at regular intervals<br></strong>Every 30-45 minutes, stand up and stretch your legs, or stretch your arms while sitting; this keeps your blood circulating and helps keep you alert.
<br><strong>Time management<br></strong>AWS Professional exams are a fight against time; you need to keep a constant eye on the clock ticking on the screen. As a rough guideline, try to complete 30 questions in the first 60 minutes, 30 in the next 60 minutes, and leave the remaining 15 questions for the last 30 minutes. Spend just 2 minutes per question, mark anything you are doubtful about for review, and move on.
<br>Time left (mins) >>>>> Questions pending<br>170 >>>>> 75<br>110 >>>>> 45<br>50 >>>>> 15<br>20 >>>>> 0
<br>In the last 20 minutes, review any questions you marked during the first pass.
<br><strong>Question reading<br></strong>Read what is really being asked first. Don’t start by reading the question from top to bottom and then the answers.
<br>Just to make it clear, see the example question below from the AWS Solutions Architect Professional exam.
<br>Question.
<br>Your company’s on-premises content management system has the following architecture:
</p><ul>
<li>Application Tier – Java code on a JBoss application server</li><li>Database Tier – Oracle database regularly backed up to Amazon Simple Storage Service (S3) using the Oracle RMAN backup utility</li><li>Static Content – stored on a 512GB gateway stored Storage Gateway volume attached to the application server via the iSCSI interface</li>
</ul><p><strong><u>Which AWS based disaster recovery strategy will give you the best RTO? </u></strong></p><p>A) Deploy the Oracle database and the JBoss app server on EC2. Restore the RMAN Oracle backups from Amazon S3. Generate an EBS volume of static content from the Storage Gateway and attach it to the JBoss EC2 server.</p><p>B) Deploy the Oracle database on RDS. Deploy the JBoss app server on EC2. Restore the RMAN Oracle backups from Amazon Glacier. Generate an EBS volume of static content from the Storage Gateway and attach it to the JBoss EC2 server.</p><p>C) Deploy the Oracle database and the JBoss app server on EC2. Restore the RMAN Oracle backups from Amazon S3. Restore the static content by attaching an AWS Storage Gateway running on Amazon EC2 as an iSCSI volume to the JBoss EC2 server.</p><p>D) Deploy the Oracle database and the JBoss app server on EC2. Restore the RMAN Oracle backups from Amazon S3. Restore the static content from an AWS Storage Gateway-VTL running on Amazon EC2<br>----<br>When you see this question, the first thing you should read is the underlined text. This will give you an idea of what is really needed. Then quickly go up, read the full question, and start reading the answer options. Keep eliminating answers until you find the one that matches what the underlined text asks for.</p><p dir="ltr"><strong>Elimination<br></strong>While reading, keep eliminating the wrong answers and iterate until you reach the most suitable answer for what the question asks. If you are stuck between 2 choices and cannot decide on the final one, mark the question for review, randomly mark one choice as the answer, and note the question number and the two contending choices on the notepad provided. You can come back to this question at the end of the exam if time permits. It happens very rarely that you get time to review. 
So, we have to mark something in the first pass and then decide later if we get time to review.</p><p><br><strong>Water bottle<br></strong>A clear water bottle is allowed; keep it next to you and take a sip when your brain starts steaming in the middle of the exam. Your brain will jam up at regular intervals, and you will need some fuel to keep it going :)</p><p>Thank you for reading. I am done with AWS DevOps Professional and now preparing for the SA Pro. What are your tips for the AWS professional exams?<br></p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-7957280298633051662019-11-16T22:49:00.001+05:302019-11-16T22:49:51.628+05:30How to pass AWS Devops Engineer Professional Exam<p dir="ltr">Hi,</p><p>I recently cleared the AWS DevOps Engineer Professional exam.</p><p>Below is what I did; hopefully it can be helpful to you as well.</p><p><strong>Course:</strong><br></p><ul><li>Stephane's Udemy course. I did everything he suggested: the readings (he suggests lots of things to read), watching the videos twice, and the labs. I also did all 3 Associate exams following Stephane, so he does assume you know the basics.</li><li>AWS official DevOps exam readiness course. This course explains how to approach the questions and the test. I highly suggest taking it, and it is a free course.</li></ul><p><strong>Practice tests </strong><br></p><ul><li>Whizlabs practice tests. My average was around 72%, and 87% in the final exam.</li><li>Udemy <a href="https://www.udemy.com/course/aws-certified-devops-engineer-professional-practice-exams-/" target="_blank">https://www.udemy.com/course/aws-certified-devops-engineer-professional-practice-exams-/</a>. My average was around 70%.</li><li>AWS official practice test. I scored only 60% on the practice test.</li></ul><p dir="ltr">This exam tests your time management, so keep an eye on the clock at your side. 
Try to finish 30 questions in the first hour, 30 in the second hour, and the remaining 15, plus the questions you marked for review, in the remaining time. I was very slow and managed to finish the exam with only 5 minutes to spare. <br><br>Good luck</p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-8311604779503991532019-10-31T18:56:00.002+05:302019-11-16T22:51:40.820+05:30How to pass AWS Certified Developer exam<div dir="ltr" style="text-align: left;" trbidi="on">
I cleared the AWS Certified Developer Associate exam with a score of 968. Below is my blueprint for success, which you can follow.<br />
<br />
Study material<br />
<ul style="text-align: left;">
<li>Udemy Stephane Maarek course</li>
<li>Linux Academy course </li>
</ul>
Practice tests <br />
<ul style="text-align: left;">
<li>Whizlabs</li>
<li>Udemy Stephane Maarek practice tests </li>
</ul>
If you are short of time, just do the Stephane Maarek course and practice with the Whizlabs and Udemy Stephane Maarek practice tests.<br />
<br />
Good luck</div>
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-17539531475144238552019-04-14T22:12:00.004+05:302019-04-14T22:12:57.251+05:30Patterns<div dir="ltr" style="text-align: left;" trbidi="on">
Architectural<br />
<br />
https://en.wikipedia.org/wiki/Architectural_pattern <br />
<br />
<br />
Software<br />
<br />
https://en.wikipedia.org/wiki/Software_design_pattern<br />
<br /></div>
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-71687164092524585922019-04-14T22:10:00.001+05:302019-04-14T22:10:17.044+05:30Convert webpage to pdf using Python<div dir="ltr" style="text-align: left;" trbidi="on">
If you don't want to use Python, one easy way is to use the website https://www.web2pdfconvert.com/<br />
<br />
If you want to use Python, see the library below:<br />
<br />
https://pypi.org/project/pdfkit/<br />
<br />
<br /></div>
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-69502731025751336822018-11-09T09:46:00.001+05:302018-11-09T09:50:17.403+05:30Gradle proxy settings<p dir="auto">If you are behind a corporate proxy, you can configure Gradle to use it with the following settings.</p><p dir="auto">Create a file called ~/.gradle/gradle.properties</p><p dir="auto">It is generally in «USER_HOME»/.gradle/gradle.properties, where «USER_HOME» is your home directory.<br>
That’s typically one of the following, depending on your platform:<br>
C:\Users\<username> (Windows Vista & 7+)<br>
/Users/<username> (macOS)
<br>/home/<username> (Linux)</p><p dir="auto">Add the HTTP proxy settings:</p><p dir="auto">systemProp.http.proxyHost=www.somehost.org<br>systemProp.http.proxyPort=8080<br>systemProp.http.proxyUser=userid<br>systemProp.http.proxyPassword=password<br>systemProp.http.nonProxyHosts=*.nonproxyrepos.com|localhost</p><p dir="auto">Also add the HTTPS proxy settings:</p><p dir="auto">systemProp.https.proxyHost=www.somehost.org<br>systemProp.https.proxyPort=8080<br>systemProp.https.proxyUser=userid<br>systemProp.https.proxyPassword=password<br>systemProp.https.nonProxyHosts=*.nonproxyrepos.com|localhost</p><p dir="ltr">If your proxy is CNTLM, you can also configure the following:</p><p dir="ltr">* Set the http.proxyUser system property to a value like domain/username.<br>* Provide the authentication domain via the http.auth.ntlm.domain system property.<br></p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-71829104610539235692018-09-15T15:59:00.001+05:302018-09-15T15:59:17.880+05:30 Class JavaLaunchHelper is implemented in both<p dir="auto">I was getting the below error while compiling my code.</p><p dir="auto">objc[3789]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/bin/java (0x1095e64c0) and /Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/jre/lib/libinstrument.dylib (0x1096724e0). One of the two will be used. 
Which one is undefined.</p><p dir="auto">Solution.</p><p dir="auto">Upgrade the JDK to the latest version from http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html</p><p dir="auto">Related details are also discussed at https://stackoverflow.com/questions/18794573/class-javalaunchhelper-is-implemented-in-both-libinstrument-dylib-one-of-th<br></p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-61842983951783723092018-09-08T05:36:00.001+05:302018-09-08T05:36:31.556+05:30How to fix when Nexus 5 cannot detect Wifi SSID<p dir="auto">Suddenly, my phone could not detect the WiFi SSID of my home router. Strangely, all other devices at home were able to connect.</p><p dir="auto">I could not figure out why, so I restarted my router and phone, but that did not help.</p><p dir="ltr">I logged into the router and checked the channel of the current WiFi connection. It was set to automatic, and its value was 12.</p><p dir="ltr">I found this known issue with my phone and higher channels. 
<br>https://www.reddit.com/r/Nexus5/comments/31ljh5/connect_to_wifi_channel_12_or_13/</p><p dir="ltr">I manually set the channel to a lower number (2), and then my phone was able to detect and connect to the WiFi.</p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-38447817806176683352018-06-30T07:53:00.001+05:302018-06-30T07:53:59.977+05:30How to get Wunderlist list of all tasks in different lists<p dir="auto">Wunderlist has become my default task management tool.</p><p dir="auto">I was looking for a way to see the list of all tasks, across all the lists, on a single page.</p><p dir="auto">After a lot of searching, I found a way inside the Mac application of Wunderlist.</p><p dir="auto">If you want to see the list of all tasks on Mac (it might work on other OSes too):</p><ul><li>Click Open preferences</li><li>Click on Smart List</li><li>Check the option next to ALL and make it Auto</li><li>Save, and then you should see the list of all tasks under the menu option All</li></ul>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-69078279836305372352018-06-30T07:49:00.001+05:302018-06-30T07:49:18.313+05:30How to prettify JSON in VS Code Editor<p dir="auto">I have been using the VS Code editor for a month now. 
I am a big fan of it, and it has become my default editor.</p><p dir="auto">Here is a tip if you want to prettify JSON in VS Code.</p><p dir="auto">Download the extension <span>Prettify JSON </span></p><p dir="ltr"><br>Install and reload the editor</p><p dir="ltr">From the Command Palette, type <span>Prettify JSON and hit enter<br><br></span></p><p dir="ltr">If you have JSON in your editor, it will now be shown in a pretty format.</p><p dir="ltr">Enjoy using VS Code</p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-23801995977461286682018-04-28T09:03:00.003+05:302018-04-28T09:03:50.533+05:30How to install libevents on Linux<div dir="ltr" style="text-align: left;" trbidi="on">
Download the library<br />
<br />
<br />
<pre>wget https://github.com/libevent/libevent/releases/download/release-2.1.8-stable/libevent-2.1.8-stable.tar.gz
tar xzf libevent-2.1.8-stable.tar.gz
cd libevent-2.1.8-stable
./configure --prefix=/opt/libevent # Change path as required
make
make install</pre></div>
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-28726906067383322912017-08-26T08:14:00.000+05:302017-08-26T08:14:05.452+05:30How to get internet access inside Jupyter notebook<div dir="ltr" style="text-align: left;" trbidi="on">
Create a file<br />
<br />
vi ~/.ipython/profile_default/startup/00-startup.py<br />
<br />
Add the following inside it<br />
(Change proxy details as per your own environment)<br />
<br />
import sys,os,os.path<br />
os.environ['HTTP_PROXY']="http://127.0.0.1:3128"<br />
os.environ['http_proxy']="http://127.0.0.1:3128"<br />
os.environ['HTTPS_PROXY']="http://127.0.0.1:3128"<br />
os.environ['https_proxy']="http://127.0.0.1:3128"<br />
<br />
# Test<br />
<br />
Inside the notebook cell<br />
<br />
import requests<br />
requests.get("http://google.com")</div>
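<p dir="ltr">The startup file works because most Python HTTP libraries honour these environment variables. A minimal sketch of the same idea, using the example proxy address from above (substitute your own):</p>

```python
# Set the proxy variables and confirm they are visible to the process.
# The proxy address is an example; change it for your environment.
import os

proxy = "http://127.0.0.1:3128"
for var in ("HTTP_PROXY", "http_proxy", "HTTPS_PROXY", "https_proxy"):
    os.environ[var] = proxy

# Libraries such as requests pick these variables up automatically.
assert all(os.environ[v] == proxy for v in ("http_proxy", "https_proxy"))
```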
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-40166551911459858492017-01-12T05:30:00.001+05:302017-01-12T05:30:13.814+05:30Atom editor proxy<p dir="auto">To use the Atom editor behind a proxy, use the below. Restart Atom if needed</p><p dir="auto"></p><pre><code>apm config set strict-ssl false
apm config set https-proxy http://127.0.0.1:3128</code></pre><br><p></p><p dir="auto">Change proxy as per your requirements</p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-90904883097230731242016-12-12T02:47:00.005+05:302016-12-12T02:53:43.484+05:30Use Kafka command line with Kerberos<pre><code>kdestroy
kinit -k -t myprincipal.keytab myprincipal/HOST.com
export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/etc/kafka/conf/kafka_client_jaas.conf"
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh \
--zookeeper zookeeper.com:2181 \
--topic topicname \
--from-beginning \
--security-protocol SASL_PLAINTEXT</code></pre>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com0tag:blogger.com,1999:blog-3657766880921369808.post-46409914959259104592016-12-12T02:47:00.004+05:302016-12-12T02:53:17.822+05:30Login to Container in Kubernetes<p dir="auto">Step 0</p><p dir="auto">Get all pods</p><p dir="auto"><code>kubectl get po --all-namespaces</code></p><p dir="auto">Find the pod name in which your container is running</p><p>Step 1</p><p dir="auto">Get the container name from the running pod</p><pre><code>kubectl describe po pod_name --namespace my_namespace
</code></pre><p><br></p><p dir="auto">Example output</p><blockquote>Name: pod_name<br>Namespace: my_namespace<br>Node: docker-host-03/10.1.3.115<br>Start Time: Wed, 23 Nov 2016 16:57:15 +1100<br>Labels: name=<strong>my_container_name</strong>,pod-template-hash=2802333548<br>Status: Running<br>IP: 10.20.71.4</blockquote><p dir="auto"><br>Note the label with the value name</p><p dir="auto">Step 2</p><p dir="auto">Log in to the container</p><p dir="auto">Example</p><p dir="auto"><code>kubectl exec -it pod_name --namespace my_namespace -c my_container_name bash</code><br></p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-23233511081035026902016-08-21T07:05:00.002+05:302016-08-21T11:29:59.867+05:30Why does my LinkedIn inbox always show one unread message?<p dir="ltr">Even if I have read all of my LinkedIn messages, it still shows one unread message. Strange but true.<br>I am not sure what the problem is, but I can tell you how I fixed it.</p><ol><li>Click on any message that you have already read.<br></li><li>Mark it as UNREAD.<br></li><li>Click on another message that you have already read.<br></li><li>Mark it as UNREAD.<br></li></ol><p><br></p><p dir="ltr">See if you notice any change in the unread messages indicator.</p><p dir="ltr">For me it changed correctly, and then I read both messages one by one and it marked all my messages as Read.</p><p dir="ltr">Have a great time using LinkedIn :)</p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-41106756252553540312016-08-20T10:05:00.001+05:302016-08-20T10:05:57.991+05:30Websites for learning Regular expressions<p dir="ltr">Regular expressions are like a Swiss Army knife that can do a lot of work in a small amount of code.</p><p dir="ltr">However, knowing them deeply is very important for writing optimal code.</p><p dir="ltr">I found the following good websites while using and learning 
about them.</p><p dir="ltr">A very good website for visual debugging of regular expressions<br><a href="https://www.debuggex.com/" target="_blank">https://www.debuggex.com/</a></p><p dir="ltr">Websites to create and test regular expressions<br><a href="http://regexr.com/" target="_blank">http://regexr.com/</a><br><a href="https://regex101.com/" target="_blank">https://regex101.com/</a><br></p><p dir="ltr"><br>A good website for regular expression tutorials<br><a href="http://www.rexegg.com/" target="_blank">http://www.rexegg.com/</a><br></p>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-42057409998372467442016-07-16T14:17:00.002+05:302016-07-16T14:44:56.203+05:30SBT Native Packager Building Rpm with Python files<div dir="ltr" style="text-align: left;" trbidi="on"><div dir="ltr">I was working on a project in which we had Airflow Python code packaged as part of RPMs. By default, the rpmbuild process byte-compiles Python files, which the Airflow server does not like when it sees them; Airflow prefers to compile them itself. So the RPM's job was to ship the files raw.</div><div dir="ltr">Here is a complete run sheet to build RPM files using the sbt-native-packager, with Java jar repackaging and Python compilation disabled.</div><div dir="ltr">Edit the file <code>project > plugins.sbt</code> and add the plugin</div><code>addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.1.1")</code><br />
<div dir="ltr">Disable Java jar repackaging and Python compilation</div><div dir="ltr">Add the following file <code>src/main/rpm/pre</code></div><div dir="ltr">The rpm pre scriptlet is executed just before the package is installed. Read more about rpm scriptlets at this <a href="http://www.rpm.org/max-rpm/s1-rpm-specref-scripts.html" target="_blank" title="Rpm scriptlets">link</a>.</div><pre>%global __os_install_post %(echo '%{__os_install_post}' | sed -e 's!/usr/lib[^[:space:]]*/brp-java-repack-jars[[:space:]].*$!!g' | sed -e 's!/usr/lib[^[:space:]]*/brp-python-bytecompile[[:space:]].*$!!g')</pre><div dir="ltr"><br />
The above script removes the <code>brp-java-repack-jars</code> and <code>brp-python-bytecompile</code> processes from the pre step of the RPM installation.</div>To make the pre file work, add the following to your build configuration file<br />
<pre>import RpmConstants._
maintainerScripts in Rpm := maintainerScriptsAppendFromFile((maintainerScripts in Rpm).value)(
  Pre -> (sourceDirectory.value / "main" / "rpm" / "pre")
)</pre><div dir="ltr"><br />
The complete <code>rpm.sbt</code> is shown below</div><pre class="brush: scala">import sbt.Keys._
import NativePackagerHelper._
enablePlugins(UniversalPlugin)
enablePlugins(RpmPlugin, RpmDeployPlugin)
maintainer in Linux := "admin@mail.com"
packageSummary in Linux := "Package code"
packageDescription := s"Application ${name.value}"
rpmVendor := "My company"
version in Rpm <<= version { (v: String) =>
  v.trim.replace("-SNAPSHOT", s".${System.currentTimeMillis}")
}
rpmLicense := Option("Apache")
rpmObsoletes := Seq(s"${name.value}")
// This does not work
rpmBrpJavaRepackJars := false
defaultLinuxInstallLocation := "/var/lib/airflow"
// This allows to override the install location using RPM prefix
rpmPrefix := Some(defaultLinuxInstallLocation.value)
import RpmConstants._
// Remove all jars
mappings in Universal := (mappings in Universal).value.filterNot {
  case (file, fname) => fname.endsWith(".jar")
}
// Add fat jar
mappings in Universal += {
  val fatJar = (assembly in Compile).value
  fatJar -> s"${name.value}.jar"
}
// Copy contents of dags folder
mappings in Universal ++= directory("src/main/dags")
// Add version.txt file
mappings in Universal += {
  val file = target.value / "version.txt"
  IO.write(file, s"${(version in Rpm).value}")
  file -> "version.txt"
}
maintainerScripts in Rpm := maintainerScriptsAppendFromFile(
  (maintainerScripts in Rpm).value)(
  Pre -> (sourceDirectory.value / "main" / "rpm" / "pre")
)
publishTo <<= version { (v: String) =>
  val artifactory = "https://artifactory.com/"
  if (v.trim.endsWith("SNAPSHOT"))
    Some("Artifactory Realm" at artifactory + "artifactory/rpm-snapshots")
  else
    Some("Artifactory Realm" at artifactory + "artifactory/rpm-releases/")
}</pre><div dir="ltr"><br />
Build the RPM using the sbt command</div><div dir="ltr"><code>rpm:packageBin</code></div><div dir="ltr">To publish the RPM use</div><div dir="ltr"><code>rpm:publish</code></div><div dir="ltr">References</div><div dir="ltr"><a href="http://www.rpm.org/max-rpm/s1-rpm-specref-scripts.html" target="_blank">http://www.rpm.org/max-rpm/s1-rpm-specref-scripts.html</a></div></div>Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.comtag:blogger.com,1999:blog-3657766880921369808.post-1457691498207868222016-06-14T05:04:00.000+05:302018-05-19T05:18:23.384+05:30How to configure SSL/TLS for Jenkins <div dir="ltr" style="text-align: left;" trbidi="on">
Configuring an SSL/TLS AD-signed certificate for Jenkins
<br />
<br />
Enable only TLS 1.2
<br />
<br />
JENKINS_JAVA_OPTIONS="-Dhttps.protocols=TLSv1.2 -Djava.awt.headless=true"
<br />
<br />
Configure the certificate
<br />
<br />
Convert and export a PKCS#12 version to import into the keystore
<br />
<br />
<br />
sudo openssl pkcs12 -inkey /var/lib/jenkins/ssl/myhost.key -in /var/lib/jenkins/ssl/myhost.cer -export -out /var/lib/jenkins/ssl/myhost.pkcs12
<br />
sudo keytool -importkeystore -srckeystore /var/lib/jenkins/ssl/myhost.pkcs12 -srcstoretype pkcs12 -destkeystore /var/lib/jenkins/ssl/jenkins.jks
<br />
<br />
Enter a password wherever it asks (I have used jenkins).
<br />
<br />
Edit /etc/sysconfig/jenkins with the following
<br />
<br />
JENKINS_ARGS="--httpsKeyStore=/var/lib/jenkins/ssl/jenkins.jks --httpsKeyStorePassword=jenkins --httpsPort=8080"</div>
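<p dir="ltr">The <code>-Dhttps.protocols=TLSv1.2</code> option restricts which protocol versions the JVM will negotiate. A rough Python sketch of the same restriction, using the standard <code>ssl</code> module to pin a client context to TLS 1.2 only (illustrative only, not part of the Jenkins setup):</p>

```python
# Build a client-side TLS context that will only negotiate TLS 1.2,
# mirroring what -Dhttps.protocols=TLSv1.2 enforces in the JVM.
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.maximum_version = ssl.TLSVersion.TLSv1_2
# Any handshake through this context now uses TLS 1.2 or fails.
print(ctx.minimum_version.name)  # TLSv1_2
```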
Jugnuhttp://www.blogger.com/profile/11554798858213664177noreply@blogger.com1