đź‘‹ A space where I teach and share ideas.

When docs and a dinosaur Git along: enabling versioning in Docusaurus

This blog was originally published on Spectro Cloud’s official blog site. Click here for a direct link to the original blog. Rapid innovation is great for the product, but a challenge for docs! Palette is a powerful, versatile product and it changes fast. We issue major releases multiple times per year, and there are always minor changes to things like our supported packs and environments. This is a good thing: it means we’re keeping pace with the needs of customers like you, and with the innovation of the cloud-native ecosystem. But it creates a challenge for our documentation, too. Users depend on documentation to navigate the many features of Palette and to get up to speed with what’s changed in each new version. At the time of this writing, our docs code base has over 435 Markdown pages containing product information, and it continues to grow. We love a challenge, and our docs team is always looking at ways to improve not only the depth and quality of the content on our docs site, but also how easy it is to consume.

SpectroMate - An Open-Source Slack integration with Mendable

Innovating the docs experience
Like most in the cloud-native community, we’re passionate about great documentation. We’ve felt the joy of discovering a clear, thoughtful guide that helps us get up and running with a new project in our homelabs, and we’ve certainly felt the pain that bad technical writing can cause. So you can understand why our docs team here at Spectro Cloud is always working to improve the experience we offer our users (that’s you). Of course, that includes expanding and refining our content itself, but also innovating how we make that content accessible. SpectroMate is an open-source project I created while working at Spectro Cloud. It is an API server with extended functionality, designed for Slack integration in the form of a bot. You can use SpectroMate to handle slash commands and message actions. This article was originally published in Spectro Cloud’s blog. Click on the Article Link to read the original source.
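To give a rough idea of what handling a Slack slash command involves, here is a minimal, hypothetical Go handler. It is only a sketch of the general pattern, not SpectroMate’s actual code: the route, port, and response text are all assumptions, and a real bot would also verify Slack’s request signature.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// slackResponse mirrors the JSON payload Slack expects back
// from a slash-command endpoint.
type slackResponse struct {
	ResponseType string `json:"response_type"` // "ephemeral" or "in_channel"
	Text         string `json:"text"`
}

func slashCommandHandler(w http.ResponseWriter, r *http.Request) {
	// Slack posts slash commands as application/x-www-form-urlencoded.
	if err := r.ParseForm(); err != nil {
		http.Error(w, "bad request", http.StatusBadRequest)
		return
	}
	command := r.FormValue("command") // e.g. "/help"
	text := r.FormValue("text")       // anything typed after the command

	reply := slackResponse{
		ResponseType: "ephemeral",
		Text:         "You invoked " + command + " with: " + text,
	}
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(reply)
}

func main() {
	// Hypothetical route; a production bot should verify Slack's signing secret.
	http.HandleFunc("/slack/command", slashCommandHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```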

Cloud Cost Questions for Engineering Managers

You have assumed the leadership of a team that is operating in a cloud environment. It’s a new beginning: you are excited about the future (hopefully), the team members, and most of all, the thrill of a new challenge. After the excitement settles down, you start asking questions to better understand the work and the team. Among the list of questions you have, you should include questions pertaining to cloud cost and cost optimization. This article was originally published on Medium. A link to the Medium article can be found here. In this article, you will find a set of questions that are beneficial for you and your team to explore further. These are questions I have found beneficial in the past, and I believe they will be beneficial to you too. Without further ado, let’s dive into it.
Q: Do we have any budget alarms established?
This is a simple question, but the answer will reveal a lot about the team, the organization, and the emphasis placed on cost management.
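The article focuses on the questions themselves, but if you want to check the answer to this one yourself, one option is to query AWS Budgets programmatically. The snippet below is only a rough sketch using the AWS SDK for Go v2; the account ID is a placeholder, and listing budgets and their notifications is just one way to see what is configured.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/budgets"
)

func main() {
	ctx := context.TODO()

	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}

	client := budgets.NewFromConfig(cfg)

	// Placeholder account ID; replace with your own.
	accountID := aws.String("123456789012")

	// List every budget defined in the account.
	out, err := client.DescribeBudgets(ctx, &budgets.DescribeBudgetsInput{
		AccountId: accountID,
	})
	if err != nil {
		log.Fatal(err)
	}

	for _, b := range out.Budgets {
		limit := "n/a"
		if b.BudgetLimit != nil {
			limit = aws.ToString(b.BudgetLimit.Amount) + " " + aws.ToString(b.BudgetLimit.Unit)
		}
		fmt.Printf("Budget: %s, limit: %s\n", aws.ToString(b.BudgetName), limit)

		// Check whether any alert notifications are attached to the budget.
		n, err := client.DescribeNotificationsForBudget(ctx,
			&budgets.DescribeNotificationsForBudgetInput{
				AccountId:  accountID,
				BudgetName: b.BudgetName,
			})
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("  notifications configured: %d\n", len(n.Notifications))
	}
}
```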

Go Lambda Cleanup

If you find yourself authoring several AWS Lambdas for a serverless application architecture, you might have encountered this error: An error occurred: mySweetLambda - Code storage limit exceeded. (Service: AWSLambda; Status Code: 400; Error Code: CodeStorageExceededException; Request ID: 05d3ae68-a7c2-a3e8-948e-41c2739638af). The first time I encountered this error I wasn’t quite sure what was happening, but after some quick web searches I learned that AWS has a limit on Lambda code storage that maxes out at 75 GB. I also learned that AWS retains all the previous versions of all my Lambdas. That’s all fine; I should probably go do some “spring cleaning” and remove the unused versions. AWS does expose the functionality to remove former versions through the console. However, in my scenario I had more than 500 versions for some of my older Lambdas. Clicking through 500+ versions is not how I want to spend my time. So what options are available?
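Scripting the cleanup is one option. The sketch below uses the AWS SDK for Go v2 to page through a function’s published versions and delete everything except $LATEST. It is a simplified illustration, not the go-lambda-cleanup implementation itself: the function name is a placeholder, and a real cleanup should keep the most recent few versions and skip any version still referenced by an alias.

```go
package main

import (
	"context"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/lambda"
)

func main() {
	ctx := context.TODO()

	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}

	client := lambda.NewFromConfig(cfg)

	// Placeholder function name.
	functionName := aws.String("mySweetLambda")

	// Page through every published version of the function.
	paginator := lambda.NewListVersionsByFunctionPaginator(client,
		&lambda.ListVersionsByFunctionInput{FunctionName: functionName})

	for paginator.HasMorePages() {
		page, err := paginator.NextPage(ctx)
		if err != nil {
			log.Fatal(err)
		}

		for _, v := range page.Versions {
			// $LATEST cannot be deleted by qualifier, so skip it.
			if aws.ToString(v.Version) == "$LATEST" {
				continue
			}

			log.Printf("deleting %s version %s",
				aws.ToString(functionName), aws.ToString(v.Version))

			_, err := client.DeleteFunction(ctx, &lambda.DeleteFunctionInput{
				FunctionName: functionName,
				Qualifier:    v.Version,
			})
			if err != nil {
				log.Fatal(err)
			}
		}
	}
}
```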

How to use AWS DynamoDB locally…

Chances are most of us have unique situations for wanting to interact with DynamoDB locally: maybe it’s to develop and test different data models, perhaps it’s to develop programmatic functions to interact with the database, perhaps you want to reduce development expenses, or perhaps you’re just doing research. Regardless of your reasons, I want to help you by showing you how to leverage DynamoDB locally. We will use the following tools: LocalStack, Terraform, Go, the AWS CLI, and NoSQL Workbench for DynamoDB. A Medium link for this article is also available. We will walk through setting up the local environment, generating data, uploading data, interacting with NoSQL Workbench, and some neat tips to keep in mind. So with that being said, let’s dive into it! Note: If you get lost, simply visit https://github.com/karl-cardenas-coding/dynamodb-local-example to view the end solution. Also, feel free to fork this template project and use it as a starting point.
Setting up the environment
First things first, ensure that you have Terraform (> v0.
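For a flavor of the programmatic side, here is a minimal, hypothetical Go snippet that points the AWS SDK for Go v2 at LocalStack and lists the local tables. The endpoint, region, and dummy credentials are assumptions based on a default LocalStack setup, not code from the example repository.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/credentials"
	"github.com/aws/aws-sdk-go-v2/service/dynamodb"
)

func main() {
	ctx := context.TODO()

	// LocalStack accepts any static credentials.
	cfg, err := config.LoadDefaultConfig(ctx,
		config.WithRegion("us-east-1"),
		config.WithCredentialsProvider(
			credentials.NewStaticCredentialsProvider("test", "test", "")),
	)
	if err != nil {
		log.Fatal(err)
	}

	// Point the DynamoDB client at LocalStack's default edge endpoint.
	client := dynamodb.NewFromConfig(cfg, func(o *dynamodb.Options) {
		o.BaseEndpoint = aws.String("http://localhost:4566")
	})

	out, err := client.ListTables(ctx, &dynamodb.ListTablesInput{})
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println("Tables:", out.TableNames)
}
```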