
Explore partial derivatives and multivariable optimization through interactive SageMath computations including critical point analysis, Lagrange multipliers, and constrained optimization problems. This hands-on Jupyter notebook covers second derivative tests, Hessian matrices, saddle point identification, and practical optimization applications in economics and engineering. CoCalc provides pre-configured computational tools for symbolic differentiation, 3D surface plotting, and gradient descent visualization, allowing students to solve complex optimization problems and understand multivariable calculus concepts through immediate computational feedback.

Kernel: SageMath 10.7

Advanced Calculus with SageMath - Chapter 2

Multivariable Functions and Partial Derivatives

This notebook contains Chapter 2 from the main Advanced Calculus with SageMath notebook.

For the complete course, please refer to the main notebook: Advanced Calculus with SageMath.ipynb

# Comprehensive imports for advanced calculus
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
import scipy.optimize as opt
import scipy.integrate as integrate
from scipy.integrate import solve_ivp, odeint
import sympy as sp
from sympy import *
from sage.all import *
import seaborn as sns

# Configure plotting
plt.style.use('seaborn-v0_8')
sns.set_palette("husl")
plt.rcParams['figure.figsize'] = (12, 8)

print("Advanced Calculus Environment Initialized")
print("Tools: SageMath, NumPy, SciPy, SymPy, Matplotlib")
print("Ready for multivariable calculus, vector analysis, and PDEs!")
Advanced Calculus Environment Initialized
Tools: SageMath, NumPy, SciPy, SymPy, Matplotlib
Ready for multivariable calculus, vector analysis, and PDEs!

Chapter 2: Multivariable Functions and Partial Derivatives

Understanding Functions of Several Variables

A multivariable function maps points in ℝⁿ to ℝ. Examples include:

  • Temperature distribution: T(x,y,z,t)

  • Economic utility: U(x₁,x₂,...,xₙ)

  • Wave equations: ψ(x,y,z,t)
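
As a quick illustration (a minimal Sage sketch added here; the temperature field below is a hypothetical example, not one taken from the course), such a function accepts several input coordinates and returns a single number:

# Minimal sketch: a function of four variables, mapping R^4 -> R
# (hypothetical temperature field chosen only for illustration)
x, y, z, t = var('x y z t')
T = 100 * exp(-(x**2 + y**2 + z**2)) * cos(t)
print("T(x,y,z,t) =", T)
print("T(1,0,0,0) =", T.subs({x: 1, y: 0, z: 0, t: 0}))  # a single real value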

Partial Derivatives and the Gradient

For a function f(x,y), the gradient is ∇f = (∂f/∂x, ∂f/∂y).

The gradient points in the direction of steepest ascent.
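
This steepest-ascent property can be checked numerically. The cell below is an illustrative sketch (not part of the original notebook): the helper names and the sample point (2, 1) are arbitrary choices. It estimates the rate of change of the chapter's example function f(x,y) = x²y + xy² - 3xy along the normalized gradient and along a few other unit directions; the gradient direction should give the largest value.

# Sketch: check that the gradient direction gives the largest rate of increase.
# Uses the same f studied in this chapter; the sample point (2, 1) is arbitrary.
def f_num(px, py):
    return px**2 * py + px * py**2 - 3*px*py

def grad_num(px, py):
    return (2*px*py + py**2 - 3*py, px**2 + 2*px*py - 3*px)

p0 = (2, 1)
gx, gy = grad_num(*p0)
gnorm = sqrt(gx**2 + gy**2)
h = 1e-5  # finite-difference step

directions = [("gradient", (gx / gnorm, gy / gnorm)),
              ("x-axis",   (1, 0)),
              ("y-axis",   (0, 1)),
              ("diagonal", (1/sqrt(2), -1/sqrt(2)))]

for name, (dx, dy) in directions:
    rate = (f_num(p0[0] + h*dx, p0[1] + h*dy) - f_num(*p0)) / h
    print(f"{name:>8s} direction: rate of change ≈ {float(rate):.4f}")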

# Multivariable functions and partial derivatives
print("MULTIVARIABLE FUNCTIONS")
print("=" * 50)

# Declare the symbolic variables (y is not predefined in Sage)
x, y = var('x y')

# Define a multivariable function: f(x,y) = x²y + xy² - 3xy
f = x**2 * y + x * y**2 - 3*x*y
print("Function: f(x,y) =", f)

# Compute partial derivatives
df_dx = diff(f, x)
df_dy = diff(f, y)
print("\nPARTIAL DERIVATIVES")
print("∂f/∂x =", df_dx)
print("∂f/∂y =", df_dy)

# Gradient vector
gradient_f = vector([df_dx, df_dy])
print("\n∇f =", gradient_f)

# Second-order partial derivatives (Hessian matrix)
f_xx = diff(f, x, 2)
f_yy = diff(f, y, 2)
f_xy = diff(f, x, y)

hessian = matrix([[f_xx, f_xy], [f_xy, f_yy]])
print("\nHESSIAN MATRIX")
print(hessian)

# Evaluate at a specific point
point = {x: 2, y: 1}
print(f"\nAt point (2,1):")
print(f"f(2,1) = {f.subs(point)}")
print(f"∇f(2,1) = {gradient_f.subs(point)}")
MULTIVARIABLE FUNCTIONS
==================================================
Function: f(x,y) = x^2*y + x*y^2 - 3*x*y

PARTIAL DERIVATIVES
∂f/∂x = 2*x*y + y^2 - 3*y
∂f/∂y = x^2 + 2*x*y - 3*x

∇f = (2*x*y + y^2 - 3*y, x^2 + 2*x*y - 3*x)

HESSIAN MATRIX
[          2*y 2*x + 2*y - 3]
[2*x + 2*y - 3           2*x]

At point (2,1):
f(2,1) = 0
∇f(2,1) = (2, 2)
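
With the Hessian available, we can go one step further and locate the critical points of f (where ∇f = 0) and classify them with the second derivative test: det H > 0 with f_xx > 0 gives a local minimum, det H > 0 with f_xx < 0 a local maximum, and det H < 0 a saddle point. The cell below is a short sketch of this added here (it is not part of the original notebook); it re-derives f and the Hessian so it can run on its own. For this f, the test identifies a local minimum at (1, 1) and saddle points at (0, 0), (0, 3), and (3, 0).

# Sketch: locate and classify the critical points of f(x,y) = x^2*y + x*y^2 - 3*x*y
# using the second derivative test (f and the Hessian are re-derived so the
# cell is self-contained)
x, y = var('x y')
f = x**2 * y + x * y**2 - 3*x*y
df_dx, df_dy = diff(f, x), diff(f, y)
H = matrix([[diff(f, x, 2), diff(f, x, y)],
            [diff(f, x, y), diff(f, y, 2)]])

# Solve grad f = 0 for the critical points
critical_points = solve([df_dx == 0, df_dy == 0], x, y, solution_dict=True)

for pt in critical_points:
    D = H.det().subs(pt)      # determinant of the Hessian at the critical point
    fxx = H[0, 0].subs(pt)
    if D > 0 and fxx > 0:
        kind = "local minimum"
    elif D > 0 and fxx < 0:
        kind = "local maximum"
    elif D < 0:
        kind = "saddle point"
    else:
        kind = "test inconclusive"
    print(f"({pt[x]}, {pt[y]}): det(H) = {D}, f_xx = {fxx} -> {kind}")
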
# 3D visualization of multivariable function
def plot_3d_function():
    # Create meshgrid for plotting
    x_vals = np.linspace(-3, 3, 50)
    y_vals = np.linspace(-3, 3, 50)
    X, Y = np.meshgrid(x_vals, y_vals)

    # Evaluate function: f(x,y) = x^2 y + x y^2 - 3 x y
    Z = X**2 * Y + X * Y**2 - 3*X*Y

    # Create 3D plot
    fig = plt.figure(figsize=(15, 5))

    # Surface plot
    ax1 = fig.add_subplot(131, projection='3d')
    surf = ax1.plot_surface(X, Y, Z, cmap='viridis', alpha=0.8)
    ax1.set_xlabel('x')
    ax1.set_ylabel('y')
    ax1.set_zlabel('f(x,y)')
    ax1.set_title(r'Surface: $f(x,y)=x^2 y + x y^2 - 3 x y$')

    # Contour plot
    ax2 = fig.add_subplot(132)
    contour = ax2.contour(X, Y, Z, levels=20)
    ax2.clabel(contour, inline=True, fontsize=8)
    ax2.set_xlabel('x')
    ax2.set_ylabel('y')
    ax2.set_title('Contour Lines')
    ax2.grid(True)

    # Gradient field
    ax3 = fig.add_subplot(133)
    # Compute gradient at grid points
    dZ_dx = 2*X*Y + Y**2 - 3*Y
    dZ_dy = X**2 + 2*X*Y - 3*X
    # Subsample for clarity
    skip = 3
    ax3.quiver(X[::skip, ::skip], Y[::skip, ::skip],
               dZ_dx[::skip, ::skip], dZ_dy[::skip, ::skip], alpha=0.7)
    ax3.contour(X, Y, Z, levels=10, alpha=0.3)
    ax3.set_xlabel('x')
    ax3.set_ylabel('y')
    ax3.set_title(r'Gradient Field $\nabla f$')
    ax3.grid(True)

    plt.tight_layout()
    plt.show()

plot_3d_function()

print("The surface shows function values, contours show level curves,")
print("and arrows show the gradient field (direction of steepest ascent)")
[Figure: surface plot, contour lines, and gradient field of f(x,y) = x²y + xy² - 3xy]
The surface shows function values, contours show level curves, and arrows show the gradient field (direction of steepest ascent)
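
The notebook's description also mentions gradient descent. As a small illustrative addition (the starting point, step size, and helper name grad_f_num below are arbitrary choices, not values from the course), the sketch follows the negative gradient of f and converges to the local minimum at (1, 1) identified by the second derivative test above.

# Sketch: plain gradient descent on f(x,y) = x^2*y + x*y^2 - 3*x*y.
# The starting point is near (1, 1) so the iterates stay in the basin of
# the local minimum; the step size 0.1 is an arbitrary illustrative choice.
def grad_f_num(px, py):
    return (2*px*py + py**2 - 3*py, px**2 + 2*px*py - 3*px)

px, py = 1.8, 1.6   # starting point
step = 0.1          # fixed step size (learning rate)

for i in range(500):
    gx, gy = grad_f_num(px, py)
    if gx**2 + gy**2 < 1e-16:        # stop when the gradient is numerically zero
        break
    px, py = px - step*gx, py - step*gy

print(f"Stopped after {i} iterations")
print(f"Approximate minimizer: ({float(px):.6f}, {float(py):.6f})")   # expect ~(1, 1)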

Continuing Your Learning Journey

You've completed Multivariable Functions and Partial Derivatives! The concepts you've mastered here form essential building blocks for what comes next.

Ready for Optimization in Multiple Dimensions?

In Chapter 3, we'll build on these foundations to study optimization in multiple dimensions. The partial derivatives, gradients, and Hessian matrices you've worked with here are exactly the tools used to find and classify extrema.

What's Next

Chapter 3 expands on this material with critical point analysis, the second derivative test, and constrained optimization, all of which build directly on the gradient and Hessian computations from this chapter.

Continue to Chapter 3: Optimization in Multiple Dimensions →

or

Return to Complete Course