
Lab Review: Cloud Storage

1. Lab Review: Cloud Storage

In this lab, you learn to create and work with buckets and objects, and apply the following Cloud Storage features: customer-supplied encryption keys, access control lists, lifecycle management, object versioning, directory synchronization, and cross-project resource sharing using IAM. Now that you're familiar with many of the advanced features of Cloud Storage, you might consider using them in applications you hadn't previously considered. A common, quick, and easy way to start using Google Cloud is to use Cloud Storage as a backup service. You can stay for a lab walkthrough, but remember that Google Cloud's user interface can change, so your environment might look slightly different.

Welcome to the walkthrough of the Cloud Storage lab. At this point, I've already started the lab in Qwiklabs, and I'm logged into the GCP console using the username and password that Qwiklabs provided for me. The first task is preparation: I'm going to create a bucket. When I go to create a bucket, it specifically tells me that I should use a globally unique name, so I'm going to use my project ID, which is already unique; I'll call it myproj- followed by my project ID. The lab tells us to set the storage class to Multi-Regional and the access control model to "Set object-level and bucket-level permissions," and then I hit Create. At this point, you can go back to the lab page and hit Check my progress, and you should get a check mark and five points for creating the Cloud Storage bucket.

The next step is downloading a file. I'm going to start Cloud Shell so I can run the curl command, and the first thing I'll do is set an environment variable to the name of the bucket I just created, just for ease of copy-pasting commands: export BUCKET_NAME_1= and then the bucket name.
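The bucket-creation step above can also be sketched from the command line. This is an illustrative equivalent of what the console does, not the lab's required steps; `myproj-my-project-id` is a placeholder, and your bucket name must be globally unique:

```shell
# Create a multi-regional bucket named after the project ID
# (myproj-my-project-id is a placeholder; substitute your own).
gsutil mb -c multi_regional gs://myproj-my-project-id

# Save the bucket name in an environment variable for easy
# copy-pasting of later commands, and verify it got set.
export BUCKET_NAME_1=myproj-my-project-id
echo $BUCKET_NAME_1
```

Setting the variable once means every later command can reference `$BUCKET_NAME_1` instead of the full name.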
To verify that it worked, I run echo, a dollar sign, and the variable name to make sure it got set correctly, and there it is. Now I'm going to download a file, which is just a publicly available Hadoop documentation HTML file, and if I do an ls, I can see my setup.html. I'm now going to copy it a couple of times to make a setup2 and a setup3. If I do an ls, I should see three files, and there they are.

The second task is ACLs. We're going to copy this file into the bucket and then configure its access control list. The first command is a gsutil command that copies setup.html into my bucket. Once it's copied, I want to get the default access list that was assigned to setup.html, which is based on the bucket's defaults, because that's how we configured it. Right here, I piped that output into acl.txt, and now I'm going to cat that file, and we can see all of the permissions that were assigned. Next I want to set the permissions to private, so I set the ACL to private, pipe the new ACL into acl2.txt to see it, and cat that file, and you can see it's now private. Then I update the access list to make the file publicly readable by running the following command, and pipe the result into acl3.txt so I can verify what that looks like: you can see it's now readable by all users. This is another checkpoint in the lab where you can hit Check my progress; in this case, it's checking whether you properly made the file publicly readable.

Now I'm going to verify in the console that my file is in the bucket and that it's publicly viewable; you can tell by the little icon and the public link that it's accessible to the public. Back in Cloud Shell, I remove setup.html from my local Cloud Shell instance. There it is; let me clear the screen here. If I do an ls, I'll see setup2 and setup3, but not setup.
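The ACL steps above look roughly like this in Cloud Shell (assuming `BUCKET_NAME_1` is set as described; the commands mirror the lab's flow):

```shell
# Copy the file into the bucket.
gsutil cp setup.html gs://$BUCKET_NAME_1/

# View the default ACL assigned to the object and save it for reference.
gsutil acl get gs://$BUCKET_NAME_1/setup.html > acl.txt
cat acl.txt

# Restrict the object to private, then inspect the new ACL.
gsutil acl set private gs://$BUCKET_NAME_1/setup.html
gsutil acl get gs://$BUCKET_NAME_1/setup.html > acl2.txt
cat acl2.txt

# Grant AllUsers read access (publicly readable), then verify.
gsutil acl ch -u AllUsers:R gs://$BUCKET_NAME_1/setup.html
gsutil acl get gs://$BUCKET_NAME_1/setup.html > acl3.txt
cat acl3.txt
```

`acl set private` replaces the whole ACL, while `acl ch` changes individual entries; that's why the lab uses `set` to lock the object down and `ch` to add the public-read grant.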
You can see it got deleted. Let's say I accidentally deleted it from my Cloud Shell instance, but now I want the copy that's in the bucket back on my local Cloud Shell. I can just copy it from the bucket to my local Cloud Shell, and if I do an ls again, I'll see all three setup files. There they are.

The third task is to generate a customer-supplied encryption key. To create the key, I run the provided command, which gives me some output that I can copy. But first, I'm going to check whether I have a .boto file: I do an ls -al, and I don't see one. So I run gsutil config -n, do ls -al again, and now I see a .boto file. There it is. I open it with nano .boto and look for the encryption key field, but then I exit back out, because I didn't copy the key that I created, which I need. That's right here; let me copy it, go back into nano, and find the line with encryption_key. I need to expand the window because it's hard to see. The decryption key is here, and here's the encryption key. I'm going to uncomment this line and paste in my key. Then I press Control+O to write the file, and Control+X to exit nano.

Now that I've set that up, I'm going to upload the remaining setup2 and setup3 files into the bucket. There's one, and there's the other. Back in the console, I refresh the bucket, and I can see both files, and it shows that they're encrypted with a customer-supplied key. This is another opportunity to hit Check my progress and make sure I got the points for that step. Next, I delete my local files by running rm setup*, which deletes setup, setup2, and setup3.
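A sketch of the key-generation and setup steps above. The Python one-liner is one common way to produce the base64-encoded 256-bit key the lab asks for; the `.boto` lines are shown as comments because you edit them by hand in nano:

```shell
# Generate a random 256-bit AES key, base64-encoded, to use as a CSEK.
python3 -c 'import base64, os; print(base64.encodebytes(os.urandom(32)).decode().strip())'

# Create a .boto config file without running the credentials prompts,
# then confirm it exists.
gsutil config -n
ls -al .boto

# In .boto, under the [GSUtil] section, uncomment and set:
#   encryption_key = <the base64 key printed above>
```

Once `encryption_key` is set, every `gsutil cp` upload from this shell is encrypted with that key automatically.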
Now I copy the files down from the bucket again, and if I cat the previously encrypted files to see whether I got them back, you can see there they are; I was able to bring them back even though they're encrypted in the bucket.

Next, I'm going to move the current customer-supplied encryption key to the decryption key. Back to nano .boto: I need to find and comment out the line that I added earlier. I should have noted the line number so I wouldn't have to find it again. The decryption keys are in the GSUtil section. Let's see, I think I'm close; I'm looking for that line. I comment out the encryption_key line and uncomment the decryption_key1 line right there. Then I paste the key into decryption_key1, and save and exit. As a best practice, you would actually delete the old customer key from the encryption line, but in this case we just copy-pasted it, so it's not a big deal.

Now I'm going to generate a new key and go back into the .boto file. I add a new encryption_key line, make sure that I copied the new key I made, and do the same thing again: I paste it in, so I'm adding encryption_key= and then the new key. Then Control+O to save, Control+X to exit. Next, I rewrite the key for file 1 and comment out the old decryption key. Again, into .boto, and I comment out decryption_key1. While the instructions have you using nano, you could definitely use the Cloud Shell editor as well; that might be a little more pleasant than this tool, but I'll leave that to you. You would access it by hitting the little pencil icon here. It's fine; decryption_key1 real quick, we're commenting that out, then saving and exiting. Now we try to download setup2 and setup3, and look what happened: no decryption key matches, because we commented it out, which makes sense.
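The key-rotation sequence above can be summarized like this. The `.boto` edits are shown as comments (they're done in nano), and the `rewrite` command is the one that re-encrypts an object under the new key:

```shell
# In .boto, rotate keys by hand:
#   # encryption_key = <old key>   (comment out; best practice is to delete it)
#   decryption_key1  = <old key>   (uncomment and paste the old key here)
#   encryption_key   = <new key>   (add the newly generated key)

# Re-encrypt setup.html in place: gsutil reads it with decryption_key1
# and writes it back encrypted with the new encryption_key.
gsutil rewrite -k gs://$BUCKET_NAME_1/setup.html

# After commenting out decryption_key1, objects that were never rewritten
# (setup2, setup3) can no longer be downloaded: "No decryption key matches".
gsutil cp gs://$BUCKET_NAME_1/setup2.html ./
```

This is why the rewritten setup.html keeps working after the old key is removed, while setup2 and setup3 fail until the old key is restored or they are rewritten too.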
The next task is to run the following command to view the current lifecycle policy. When we run it, it says the bucket has no lifecycle configuration. So I'm going to create a JSON lifecycle policy file and paste the provided rule into it; it says that if an object is over 31 days old, delete it. I write the file and exit. Then, to set the policy, I run the command provided in the box, and to verify that it worked, I run the get command again. This is another opportunity for you to check your progress and get more points in the lab; at this point, you should have about 20 out of 35 points.

Task 6 is enabling versioning, and you can check the current state with the following command. It says Suspended, which means it's not enabled. To enable versioning, we run the set command, and if we run the get command again, we no longer see Suspended; we see Enabled. There it is. Check your progress again and you'll get more points.

In the next step, we're going to create several versions of the sample file in the bucket. I do an ls here, then open the setup.html file and delete any five lines to change the file's size. I'm going to comment out this link and then delete all of these links; there's probably a faster way to do this than just holding down Delete, but that's what I'm doing here. I'll delete all the way down to the banner, so I've now effectively changed the size of the file. Then Control+O, Enter, Control+X, and I copy the file to the bucket. I go back into setup.html, delete another five lines, delete some more links up to here, save it, and copy it to the bucket again. Because each subsequent copy had different lines deleted, making the size smaller, each one created a new version.
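The lifecycle and versioning steps above correspond to these commands (the 31-day delete rule matches the one described in the walkthrough; `life.json` is the file name assumed here):

```shell
# View the current lifecycle policy (initially: no configuration).
gsutil lifecycle get gs://$BUCKET_NAME_1

# Write a policy that deletes objects more than 31 days old.
cat > life.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 31}
    }
  ]
}
EOF

# Apply the policy, then verify it took effect.
gsutil lifecycle set life.json gs://$BUCKET_NAME_1
gsutil lifecycle get gs://$BUCKET_NAME_1

# Task 6: check versioning (Suspended), enable it, and re-check (Enabled).
gsutil versioning get gs://$BUCKET_NAME_1
gsutil versioning set on gs://$BUCKET_NAME_1
gsutil versioning get gs://$BUCKET_NAME_1
```

With versioning enabled, each subsequent `gsutil cp` of the edited setup.html creates a new generation of the object instead of overwriting it.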
If I list all versions of the file, you can see there are three: the original one, the one where I deleted the first five lines, and the one where I deleted the next set of lines. I'm now going to store the version value in an environment variable: export VERSION_NAME= and then the oldest version, which is this one. I copy it, set the variable, and make sure it got set correctly, and it is. Now I download the oldest version, call it recovery.txt, and then verify the recovery with a couple of commands.

It's saying it can't ls setup.html; looks like that piece didn't work. I think what I did was set the version name to the wrong thing; it should have been this one. So I run the gsutil command again, and it still didn't match. If this happens to you, it's because you didn't follow the instructions, like me: you should have copied the entire URL for that object. Usually, when you have an issue with a lab, it's not that the lab is broken; it's that you missed a step. So go back three steps and repeat, and that usually works, as you can see here, because it just worked. ls -al setup.html, and there's the file, and there's the recovered text. You can see that the sizes are different.

Task 7: we're going to synchronize a directory to a bucket. I create the directory structure, copy the files in, and then sync the first-level directory on the VM with my bucket. To verify that it worked, I can check in the browser: I refresh the bucket, go into the first level, and you can see there's a second level. We see the same thing in the console as we do on the command line, so I can exit Cloud Shell.

Now we're going to do some cross-project sharing; this is the last little piece of this lab. I open another tab, go to console.cloud.google.com again, and since I'm already signed in, I select the other project.
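The version-recovery and directory-sync steps above can be sketched like this. The generation number after `#` is made up for illustration; as the walkthrough warns, you must copy the entire versioned URL, and the directory names here are assumed:

```shell
# List every generation (version) of the object.
gsutil ls -a gs://$BUCKET_NAME_1/setup.html

# Store the FULL versioned URL of the oldest generation, e.g.
# gs://<bucket>/setup.html#1234567890123456 (generation number is made up).
export VERSION_NAME=gs://$BUCKET_NAME_1/setup.html#1234567890123456
echo $VERSION_NAME

# Recover the oldest version and compare file sizes.
gsutil cp $VERSION_NAME recovery.txt
ls -al setup.html recovery.txt

# Task 7: recursively sync a local directory tree into the bucket.
mkdir -p firstlevel/secondlevel
cp setup.html firstlevel/
cp setup.html firstlevel/secondlevel/
gsutil rsync -r ./firstlevel gs://$BUCKET_NAME_1/firstlevel
```

The `-a` flag on `ls` is what exposes the non-current generations; without it, only the live version is listed.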
I copy the project ID from the Qwiklabs lab guide and select that project; this is my other project. Then I create a bucket for this project; there shouldn't be one in here, because it's a new project. I'm also going to call it myproj- followed by the project ID, and hit Create. This will be bucket name two. Next I upload a file, any file; I've uploaded a screenshot, and this will be my file name. I was going to rename it, but I can't.

Now I go to IAM, then Service accounts, and create a service account. I call it cross-project-storage and click Create. I give it the Storage Object Viewer role and click Continue. Then I create a key, select JSON, and click Create, and it downloads the key file for me. There it is. I hit Close, and then Done. I rename the downloaded file credentials.json; here it is. I switch back to the other project, hit Check my progress, and I should get five more points. Now we're just five points away from finishing the lab.

Back in project 1, we're going to create a VM. I call it crossproject, put it in a Europe region, zone d, make it a micro instance, and hit Create. Once the VM is ready, I SSH into it. There it is; I click SSH, then move my other window back here to get the name of the bucket I created in the second project. I export that as an environment variable and verify that it worked, and then I export the file name of the file that I uploaded. I grab that and put quotes around it, because there's a space in it, and verify that worked too; there it is. Now I ls what's in that bucket. From the VM on this side, it tells me that I don't have access to do that, which is what I wanted to verify. So I'm going to use the upload-file feature here and select the credentials.json that I downloaded. I hit Close, and then I authorize with that file to verify access. I run the ls again, and now I can see my file in there.
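On the VM, the cross-project steps above look roughly like this. The bucket and file names are placeholders standing in for the second project's bucket and the uploaded screenshot; the `gcloud auth activate-service-account` command is what switches the shell to the service account's credentials:

```shell
# Point at the second project's bucket and the uploaded file
# (both names are placeholders; note the quotes around the space).
export BUCKET_NAME_2=myproj-second-project-id
export FILE_NAME="my screenshot.png"
echo $BUCKET_NAME_2
echo "$FILE_NAME"

# This fails at first: the VM's default credentials have no access
# to the other project's bucket.
gsutil ls gs://$BUCKET_NAME_2/

# After uploading credentials.json to the VM, activate the service account.
gcloud auth activate-service-account --key-file credentials.json

# Now the listing works, and the object itself is readable
# (the service account has Storage Object Viewer).
gsutil ls gs://$BUCKET_NAME_2/
gsutil cat "gs://$BUCKET_NAME_2/$FILE_NAME"
```

Because the role is only Storage Object Viewer at this point, reads succeed but writes still fail, which is what the next step fixes.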
I can do the same with the file itself. Now let me try to copy a file up: it fails, because these credentials don't have write access to that project. To fix that, I go back to the other project and modify the role in IAM. This should be my last step: back in IAM, I find cross-project-storage, hit the pencil icon, also give it the Storage Object Admin role, and save. Once I hit Save, I can check my progress, and then you will have all of the points in the lab. The last step is optional: you just return to your SSH terminal and verify that everything is good to go. But that is the entire walkthrough for this lab. I hope you enjoyed it.

2. Let's practice!
