Pandas Titanic homework

https://docs.google.com/document/d/1wykki4M2vSAFYc3S1QkDINfBDBSEzgR29kcBH-Yx3SU/edit?usp=sharing

IP

47.243.x.x

Token

student@hku.hk

Homework

Download the data file titanic.csv.

Download the homework file.

Submit your Python code file and the final ‘titanic_eda.csv’ file.
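For orientation, here is a minimal sketch of the expected workflow: load titanic.csv with pandas, inspect it, apply some cleaning, and write the result to titanic_eda.csv. The cleaning steps shown (filling ‘Age’, dropping rows with missing ‘Embarked’) assume the standard Kaggle Titanic column names and are only an illustration; the required steps are defined in the homework file.

```python
import pandas as pd

# Load the raw Titanic data (file name from the homework instructions).
df = pd.read_csv("titanic.csv")

# Quick look at structure and missing values.
df.info()
print(df.isna().sum())

# Example cleaning steps -- the actual required steps are in the homework file.
# 'Age' and 'Embarked' are standard columns in the Kaggle Titanic dataset.
df["Age"] = df["Age"].fillna(df["Age"].median())
df = df.dropna(subset=["Embarked"])

# Save the cleaned result under the required submission name.
df.to_csv("titanic_eda.csv", index=False)
```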

Lab

  1. Build your Llama2 LLM solution with PAI-EAS and AnalyticDB for PostgreSQL (200 credits)
    https://labex.io/courses/build-your-llama2-llm-solution-with-pai-eas-and-adb-pg
  • This lab introduces how to deploy an enterprise-level AI knowledge base dialogue using AnalyticDB for PostgreSQL and PAI-EAS. It uses AnalyticDB for PostgreSQL to retrieve the enterprise-specific knowledge base and PAI-EAS for AI language model inference, with the two components seamlessly integrated (a sketch of calling such a deployed endpoint follows the lab list).
  2. Use AI Container Image to Deploy the Qwen Large Language Model (200 credits)
    https://labex.io/courses/use-ai-container-image-to-deploy-the-qwen-large-language-model
  • This lab will use Alibaba Cloud AI Containers (AC2) container image service to deploy the Qwen series large language models through Docker container images.
  3. PAI-EAS Quick Deployment of AI Painting Stable Diffusion WebUI (200 credits)
    https://labex.io/courses/pai-eas-quick-deployment-of-ai-painting-stable-diffusion-webui
  • In this lab, you will learn how to use the pre-configured image of Alibaba Cloud’s PAI-EAS model online service to quickly deploy an AI web application for AIGC painting (Stable Diffusion WebUI) and start the WebUI for model inference.
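Each of these labs ends with a deployed inference service. As a hedged illustration only (not part of the lab instructions), the sketch below shows what calling a PAI-EAS-style HTTP endpoint with token authentication can look like. The endpoint URL, token, and request payload are placeholders; take the real values and the payload format from your own deployment in the EAS console.

```python
import requests

# Hypothetical placeholders -- PAI-EAS provides a service endpoint URL and a
# token after deployment; substitute the values from your own EAS console.
EAS_ENDPOINT = "http://<your-eas-endpoint>"
EAS_TOKEN = "<your-service-token>"

# PAI-EAS services authenticate with the service token in the Authorization header.
resp = requests.post(
    EAS_ENDPOINT,
    headers={"Authorization": EAS_TOKEN},
    # The payload schema depends on the deployed model service; a plain-text
    # prompt is a common case for chat-style LLM deployments.
    data="Hello, who are you?".encode("utf-8"),
    timeout=60,
)
resp.raise_for_status()
print(resp.text)
```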
