ubuntu2204
Kernel: Python 3 (system-wide)

Installing Google's Python library. This will be installed globally by default soon...

#! pip3 install google-cloud-aiplatform

You need a project in GCP and a service account with the role "Vertex AI User". The environment variable below points to that account's key file.

import os
from pprint import pprint
from IPython.display import Markdown

# this is a key file for a service account, which only has the role "Vertex AI User"
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/home/user/smi-gcp-vertex-ai-service-account.json'
import vertexai
from vertexai.preview.language_models import TextGenerationModel
def palm_llm(
    project_id: str,
    model_name: str,
    temperature: float,
    max_decode_steps: int,
    top_p: float,
    top_k: int,
    content: str,
    location: str = "us-central1",
    tuned_model_name: str = "",
):
    """Predict using a Large Language Model."""
    vertexai.init(project=project_id, location=location)
    model = TextGenerationModel.from_pretrained(model_name)
    if tuned_model_name:
        model = model.get_tuned_model(tuned_model_name)
    response = model.predict(
        content,
        temperature=temperature,
        max_output_tokens=max_decode_steps,
        top_k=top_k,
        top_p=top_p,
    )
    return response.text
palm = lambda prompt: palm_llm(
    "sage-math-inc",
    "text-bison",  # "text-bison@001" ... without versioning, it's the "latest"
    0.5,  # temperature: default 0.2, can go up to 1; raised to 0.5 just to see what happens
    512,  # max output tokens: default 256, raised to 512
    0.8,  # top-p: how the model selects tokens for output. Tokens are selected from most probable to least until the sum of their probabilities equals the top-p value. For example, if tokens A, B, and C have a probability of .3, .2, and .1 and the top-p value is .5, then the model will select either A or B as the next token (using temperature). The default top-p value is .8.
    40,  # top-k: the number of most likely tokens considered for the next token; higher values allow more "diverse" answers
    prompt,
    "us-central1")
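The top-p behavior described in the comment above can be illustrated with a small standalone sketch. This is not Vertex AI code, just a toy filter over a hypothetical token distribution:

```python
def top_p_candidates(probs, top_p):
    """Return the tokens kept by top-p (nucleus) filtering:
    tokens are taken from most probable to least until their
    cumulative probability reaches top_p."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, cumulative = [], 0.0
    for token, p in ranked:
        kept.append(token)
        cumulative += p
        if cumulative >= top_p:
            break
    return kept

# the example from the comment: A=.3, B=.2, C=.1 with top-p=.5
print(top_p_candidates({"A": 0.3, "B": 0.2, "C": 0.1}, 0.5))  # ['A', 'B']
```

With top-p = .5, only A and B survive the cutoff, and the model then samples one of them according to the temperature.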
Markdown(palm('show me in python, how two mutually recursive functions can compute fibonacci numbers.'))
def fib_recursive(n):
    if n <= 1:
        return n
    else:
        return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
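Note that the model's two functions above never call each other, so they are not actually mutually recursive. For contrast, here is a pair that is (a sketch of my own; the function names are mine, with `fib_next(n)` computing F(n+1)):

```python
def fib(n):
    # F(0) = 0; otherwise F(n) = F((n-1)+1), delegated to fib_next
    return 0 if n == 0 else fib_next(n - 1)

def fib_next(n):
    # fib_next(n) = F(n+1) = F(n-1) + F(n); it calls back into fib,
    # so the two functions are mutually recursive
    return 1 if n == 0 else fib(n - 1) + fib_next(n - 1)

print([fib(n) for n in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```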
print(palm('''
Improve the language of the following news announcement and expand the text a little bit, while keeping the markdown formatting as it is:
"""
The 22.04 line of software environments just got an update. It includes the [Macaulay2]() Jupyter Kernel.
Here is an [m2 example notebook](https://cocalc.com/hsy/ubuntu-22.04-testing/macaulay2).

Besides that, many new and updated packages are available now. E.g. for the Python3 (system-wide) environment:

* [GerryChain](https://gerrychain.readthedocs.io) - *a library for using Markov Chain Monte Carlo methods to study the problem of political redistricting.*
* [cirq](https://github.com/quantumlib/Cirq) - *creating, editing, and invoking Noisy Intermediate Scale Quantum (NISQ) circuits*
* [tequila](https://github.com/tequilahub/tequila) - *A High-Level Abstraction Framework for Quantum Algorithms*

As usual, you can switch back to the previous 22.04 environment via Project Settings → Project Control → Software Environment: "Ubuntu 22.04 // Previous".

Please report any issues!
"""
'''))
The 22.04 line of software environments just got an update. It includes the [Macaulay2](https://cocalc.com/hsy/ubuntu-22.04-testing/macaulay2) Jupyter Kernel.

Besides that, many new and updated packages are available now. E.g. for the Python3 (system-wide) environment:

* [GerryChain](https://gerrychain.readthedocs.io) - a library for using Markov Chain Monte Carlo methods to study the problem of political redistricting.
* [cirq](https://github.com/quantumlib/Cirq) - creating, editing, and invoking Noisy Intermediate Scale Quantum (NISQ) circuits
* [tequila](https://github.com/tequilahub/tequila) - A High-Level Abstraction Framework for Quantum Algorithms

As usual, you can switch back to the previous 22.04 environment via Project Settings → Project Control → Software Environment: "Ubuntu 22.04 // Previous".

Please report any issues!
Markdown(palm('show me, what mutually recursive functions are, using python'))
def factorial(n):
    if n == 0:
        return 1
    else:
        return n * factorial(n-1)

def fibonacci(n):
    if n == 0 or n == 1:
        return n
    else:
        return fibonacci(n-1) + fibonacci(n-2)
Markdown(palm("""\
Explain the following Python 3 (system-wide) code that is in a Jupyter notebook:

```python
import pandas as pd
data = [{"a": 1, "b": 1}, {"a": 123}]
x = pd.DataFrame(data)
x.to_excel('data.xls')
```
"""))

This code imports the pandas library, creates a list of dictionaries, and then converts the list to a pandas DataFrame. The DataFrame is then saved to an Excel file named 'data.xls'.
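The model's explanation is correct but terse. One detail worth adding: the second dict is missing the "b" key, so that cell becomes NaN in the DataFrame. A quick check of the DataFrame part (writing legacy .xls needs an extra engine in current pandas, so the modern route would be .xlsx; here I only inspect the frame):

```python
import pandas as pd

# same data as in the prompt: the second dict has no "b" key
data = [{"a": 1, "b": 1}, {"a": 123}]
x = pd.DataFrame(data)

print(x)
# the missing value shows up as NaN in column "b"
print(x["b"].isna().tolist())  # [False, True]

# note: with current pandas, prefer x.to_excel('data.xlsx'),
# which uses the openpyxl engine; .xls writing is deprecated
```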

Markdown(palm('''\
Show me how to make a table with 3 columns, where the first two are aligned left, and the last one aligned right.
The table should have full width on the page.
I am writing LaTeX in the file latex-tests/chatgpt.tex.
'''))
\documentclass{article}
\usepackage{fullpage}
\usepackage{tabularx}

\begin{document}

\begin{table*}
\centering
\begin{tabularx}{\textwidth}{|l|l|r|}
\hline
\textbf{First Column} & \textbf{Second Column} & \textbf{Third Column} \\
\hline
Left-aligned text & Left-aligned text & Right-aligned text \\
\hline
\end{tabularx}
\end{table*}

\end{document}

Chat Model

https://cloud.google.com/python/docs/reference/aiplatform/latest/vertexai.preview.language_models.ChatModel

The basic idea is to start with a message by the "user", then record what the "bot" replied, and then continue with the next "user" message. I.e., you send an odd number of messages, and the prediction completes them to an even number of messages.

The "context" is a general setup for the chat: use it to tune the response, define the target audience (children, students, ...), its verbosity/conciseness, etc.
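The odd/even alternation idea can be stated as a tiny check in plain Python (my own sketch, independent of the Vertex AI API):

```python
def is_valid_history(messages):
    """A chat history sent for prediction: authors must alternate,
    starting with "user", and the list length must be odd so that
    the model's reply completes it to an even number of messages."""
    if len(messages) % 2 == 0:
        return False
    expected = ["user", "bot"]
    return all(m["author"] == expected[i % 2] for i, m in enumerate(messages))

history = [
    {"author": "user", "content": "what is the true color of the sky?"},
    {"author": "bot", "content": "Blue, due to Rayleigh scattering."},
    {"author": "user", "content": "is this the same reason why the oceans are blue?"},
]
print(is_valid_history(history))  # True
```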

import vertexai
from vertexai.preview.language_models import ChatModel, InputOutputTextPair

def predict_large_language_model_sample(project_id: str,
                                        model_name: str,
                                        temperature: float,
                                        max_output_tokens: int,
                                        top_p: float,
                                        top_k: int,
                                        location: str = "us-central1",
                                        context="",
                                        question=""):
    vertexai.init(project=project_id, location=location)
    chat_model = ChatModel.from_pretrained(model_name)
    parameters = {
        "temperature": temperature,
        "max_output_tokens": max_output_tokens,
        "top_p": top_p,
        "top_k": top_k,
    }
    chat = chat_model.start_chat(context=context)
    response = chat.send_message(question, **parameters)
    print(response.text)
    return response

chat = lambda context, question: predict_large_language_model_sample(
    "sage-math-inc", "chat-bison@001", 0.8, 512, 0.8, 40, "us-central1", context, question)
chat_context = ('Enclose all Math in $. Include the language directly after the triple backticks '
                'in all markdown code blocks. Be concise. '
                'You are talking to a student at a university or college.')

r1 = chat(chat_context, "Show me how to differentiate f(x)=(1-x^2)/(2 x + 1)^2 in sympy")
```
>>> import sympy as sym
>>> x = sym.Symbol('x')
>>> f = (1 - x**2)/(2*x + 1)**2
>>> sym.diff(f, x)
-(4*x**3 - 4*x**2 + 2)/(2*x + 1)**3
```
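Model outputs like this are worth verifying. Running the snippet locally, sympy's result does not match what the model printed: differentiating by hand, the quotient rule gives -2*(x + 2)/(2*x + 1)**3, not the expression above. A quick check:

```python
import sympy as sym

x = sym.Symbol('x')
f = (1 - x**2)/(2*x + 1)**2

# simplify the derivative and compare against the hand-computed form
df = sym.simplify(sym.diff(f, x))
print(df)

# the difference to -2*(x + 2)/(2*x + 1)**3 simplifies to zero,
# confirming that form and refuting the model's printed answer
assert sym.simplify(df - (-2*(x + 2)/(2*x + 1)**3)) == 0
```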

vertexai.preview.language_models.ChatSession

from vertexai.preview.language_models import ChatModel, ChatSession, InputOutputTextPair

def create_chat_session():
    vertexai.init(project='sage-math-inc', location="us-central1")
    model = ChatModel.from_pretrained('chat-bison@001')
    parameters = {
        "temperature": 0.5,
        "max_output_tokens": 512,
        "top_p": 0.8,
        "top_k": 40,
    }
    session = ChatSession(model, chat_context, **parameters)
    return session
session = create_chat_session()
resp_q = session.send_message('How can I write a word with a large and bold font in LaTeX?')
Markdown(resp_q.text)

You can write a word with a large and bold font in LaTeX by using the \textbf{} command. For example, the following code will produce the word "Hello" in a large and bold font:

\textbf{Hello}

It still knows I'm talking about LaTeX

Markdown(session.send_message("Thank you, and how to change the font color?").text)

You can change the font color in LaTeX by using the \textcolor{color}{text} command. For example, the following code will produce the word "Hello" in a red font:

\textcolor{red}{Hello}

Full conversation history using PredictionServiceClient.predict

from google.cloud import aiplatform
from google.cloud.aiplatform.gapic.schema import predict
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value

def predict_large_language_model_sample(
    api_endpoint: str,
    endpoint: str,
    input: str,
    parameters: str,
    location: str = "us-east1",
):
    # The AI Platform services require regional API endpoints.
    client_options = {"api_endpoint": api_endpoint}
    # Initialize client that will be used to create and send requests.
    # This client only needs to be created once, and can be reused for multiple requests.
    client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)
    instance_dict = input
    instance = json_format.ParseDict(instance_dict, Value())
    instances = [instance]
    parameters_dict = parameters
    parameters = json_format.ParseDict(parameters_dict, Value())
    response = client.predict(endpoint=endpoint, instances=instances, parameters=parameters)
    print("response")
    predictions = response.predictions
    for prediction in predictions:
        print("prediction:\n", dict(prediction))
    return predictions
resp = predict_large_language_model_sample(
    "us-central1-aiplatform.googleapis.com",
    "projects/sage-math-inc/locations/us-central1/publishers/google/models/codechat-bison@001",
    {
        "messages": [{
            "author": "user",
            "content": "what is the true color of the sky?"
        }]
    },
    {
        "temperature": 0.5,
        "maxOutputTokens": 512
    })
resp
response
prediction:
 {'citationMetadata': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a40a0>], 'safetyAttributes': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a40a0>], 'candidates': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a40a0>]}
[<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a7eb0>]
a = dict(resp[0])
a
{'citationMetadata': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a79d0>], 'safetyAttributes': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a79d0>], 'candidates': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a79d0>]}
Markdown(a['candidates'][0]['content'])

The sky is blue because of a phenomenon called Rayleigh scattering. This is the scattering of light by particles that are smaller than the wavelength of light. The particles in the atmosphere that cause Rayleigh scattering are molecules of nitrogen and oxygen.

When sunlight enters the atmosphere, it is scattered in all directions by these molecules. The blue light is scattered more than the other colors because it travels as shorter, smaller waves. This is why we see a blue sky most of the time.

However, the sky can appear different colors at different times of day. At sunrise and sunset, the sun is low in the sky and the sunlight has to travel through more of the atmosphere to reach our eyes. This means that more of the blue light is scattered away, and we see more of the other colors, such as red and orange.

a['citationMetadata'][0]['citations']
[<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a7010>]
resp2 = predict_large_language_model_sample(
    "us-central1-aiplatform.googleapis.com",
    "projects/sage-math-inc/locations/us-central1/publishers/google/models/codechat-bison@001",
    {
        "messages": [{
            "author": "user",
            "content": "what is the true color of the sky?"
        }, {
            "author": "bot",
            "content": 'The sky is blue because of a phenomenon called Rayleigh scattering. This is the scattering of light by particles that are smaller than the wavelength of light. The particles in the atmosphere that cause Rayleigh scattering are molecules of nitrogen and oxygen.\n\nWhen sunlight enters the atmosphere, it is scattered in all directions by these molecules. Blue light is scattered more than other colors because it travels as shorter, smaller waves. This is why the sky appears blue during the day.\n\nThe sky can appear different colors at different times of day and in different locations. For example, the sky can appear red at sunrise and sunset. This is because the sunlight has to travel through more of the atmosphere to reach our eyes at these times. The blue light is scattered away, leaving the red light to reach our eyes.\n\nThe sky can also appear different colors in different locations. For example, the sky can appear more blue over the ocean than over land. This is because the ocean reflects more blue light than land does.'
        }, {
            "author": "user",
            "content": "is this the same reason, why the oceans are blue?"
        }]
    },
    {
        "temperature": 0.5,
        "maxOutputTokens": 512
    })
resp2
response
prediction:
 {'candidates': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a64a0>], 'safetyAttributes': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a64a0>], 'citationMetadata': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a64a0>]}
[<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a79d0>]
a = dict(resp2[0])
Markdown(a['candidates'][0]['content'])

Yes, the same principle applies to the oceans. The ocean appears blue because it reflects more blue light than other colors. This is because water molecules are smaller than the wavelength of blue light, so they can absorb and scatter blue light more easily than other colors.

First, asking about calculus in SymPy. Then switching gears to SageMath, by simply asking "and how would this work in sagemath?"

resp3 = predict_large_language_model_sample(
    "us-central1-aiplatform.googleapis.com",
    "projects/sage-math-inc/locations/us-central1/publishers/google/models/codechat-bison@001",
    {
        "messages": [{
            "author": "user",
            "content": "show me how to differentiate in sympy"
        }, {
            "author": "bot",
            "content": "To differentiate a function in sympy, you can use the `diff` function. For example, to differentiate the function `y = x^2`, you would use the following code:\n\n```\n>>> import sympy\n>>> x = sympy.Symbol('x')\n>>> y = x**2\n>>> sympy.diff(y, x)\n```\n\nThis will return the expression `2*x`, which is the derivative of `y = x^2`.\n\nYou can also use the `diff` function to differentiate a function with respect to a different variable. For example, to differentiate the function `y = x^2` with respect to `y`, you would use the following code:\n\n```\n>>> sympy.diff(y, y)\n```\n\nThis will return the expression `0`, which is the derivative of `y = x^2` with respect to `y`.",
        }, {
            "author": "user",
            "content": "and how integrate symbolically?"
        }, {
            "author": "bot",
            "content": "To integrate a function in sympy, you can use the `integrate` function. For example, to integrate the function `x^2`, you would use the following code:\n\n```\n>>> import sympy\n>>> x = sympy.Symbol('x')\n>>> sympy.integrate(x**2)\n```\n\nThis will return the expression `x**3/3`, which is the indefinite integral of `x^2`.\n\nYou can also use the `integrate` function to integrate a function with respect to a different variable. For example, to integrate the function `x^2` with respect to `y`, you would use the following code:\n\n```\n>>> sympy.integrate(x**2, y)\n```\n\nThis will return the expression `x**3/3*y`, which is the definite integral of `x^2` with respect to `y`.",
        }, {
            "author": "user",
            "content": "and how would this work in sagemath?"
        }]
    },
    {
        "temperature": 0.5,
        "maxOutputTokens": 512
    },
    "us-central1")
resp3
response
prediction:
 {'candidates': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a7310>], 'safetyAttributes': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a7310>], 'citationMetadata': [<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a7310>]}
[<proto.marshal.collections.maps.MapComposite object at 0x7f5ab05a5ff0>]
a = dict(resp3[0])
Markdown(a['candidates'][0]["content"])

To differentiate a function in SageMath, you can use the diff function. For example, to differentiate the function y = x^2, you would use the following code:

sage: x = var('x')
sage: y = x**2
sage: diff(y, x)

This will return the expression 2*x, which is the derivative of y = x^2.

You can also use the diff function to differentiate a function with respect to a different variable. For example, to differentiate the function y = x^2 with respect to y, you would use the following code:

sage: diff(y, y)

This will return the expression 0, which is the derivative of y = x^2 with respect to y.

To integrate a function in SageMath, you can use the integrate function. For example, to integrate the function x^2, you would use the following code:

sage: integrate(x**2)

This will return the expression x**3/3, which is the indefinite integral of x^2.

You can also use the integrate function to integrate a function with respect to a different variable. For example, to integrate the function x^2 with respect to y, you would use the following code:

sage: integrate(x**2, y)

This will return the expression x**3/3*y, which is the definite integral of x^2 with respect to y.
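One caveat on the bot's sympy claims echoed above: integrating x**2 with respect to y treats x as a constant, so the result is x**2*y (still an indefinite integral, not a definite one), not x**3/3*y. A quick sympy check:

```python
import sympy as sym

x, y = sym.symbols('x y')

# integrating with respect to x gives the familiar antiderivative
print(sym.integrate(x**2, x))  # x**3/3

# with respect to y, x**2 is just a constant factor
print(sym.integrate(x**2, y))  # x**2*y
```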