Kolmogorov–Arnold Networks (KAN): A New Era in AI
Perplexity's new revenue sharing plan
Welcome to the learning edition of the Data Pragmatist, your dose of all things data science and AI.
📖 Estimated Reading Time: 5 minutes. Missed our previous editions?
🤖 Instagram now lets you create an AI chatbot of yourself LINK
Meta has released a new tool called AI Studio, enabling users in the US to create AI characters on Instagram or the web to interact with followers on their behalf.
These AI profiles can engage in direct chat threads, respond to comments, and are customizable based on the creator’s Instagram content and specified interaction guidelines.
In addition to creating personalized AI, users can also design entirely new characters to use across Meta’s platforms, with Meta ensuring these AI profiles are clearly labeled to avoid confusion.
💸 Perplexity's new revenue sharing plan LINK
Perplexity has started a program to share advertising revenue with publishers after facing plagiarism accusations from several media outlets.
The "Publishers' Program" includes partners like Time, Der Spiegel, and Automattic, who will receive a portion of ad revenue for their content used by Perplexity.
This initiative follows investigations by Forbes and Wired, which reported Perplexity’s AI misusing and paraphrasing their articles without proper attribution.
The fastest way to build AI apps
Writer Framework: build Python apps with drag-and-drop UI
API and SDKs to integrate into your codebase
Intuitive no-code tools for business users
🧠 Kolmogorov–Arnold Networks (KAN): A New Era in AI
In AI, the Multi-Layer Perceptron (MLP) has long been a foundational building block. Kolmogorov–Arnold Networks (KANs) are poised to shake up that foundation: inspired by the Kolmogorov–Arnold representation theorem, they decompose complex multivariate functions into simpler one-dimensional functions, each represented by a spline, specifically a B-spline.
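Roughly stated, the theorem behind this says that any continuous function of several variables can be assembled from one-dimensional functions and addition alone:
f(x_1, \ldots, x_n) = \sum_{q=0}^{2n} \Phi_q\left( \sum_{p=1}^{n} \varphi_{q,p}(x_p) \right)
where every \Phi_q and \varphi_{q,p} is a continuous function of a single variable; KANs turn this two-layer construction into deeper networks whose edges carry learnable one-dimensional functions.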
What Are Kolmogorov–Arnold Networks?
KANs build on the Kolmogorov–Arnold representation theorem, which states that any continuous multivariate function can be expressed through sums and compositions of one-dimensional functions. In a KAN, those one-dimensional functions are represented by splines, smooth curves shaped by a small set of control points. This lets KANs replace the fixed activation functions of traditional neural networks with learnable B-splines on every edge, resulting in a more adaptive and interpretable model.
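To make the "learnable activation" idea concrete, here is a small illustrative sketch using scipy rather than pykan: a cubic B-spline whose handful of coefficients define a smooth one-dimensional curve. In a real KAN, coefficients like these are the trainable parameters sitting on each edge of the network.
# Illustrative sketch only (scipy, not pykan): a cubic B-spline as a 1-D curve
import numpy as np
from scipy.interpolate import BSpline

k = 3                                                  # cubic spline
knots = np.concatenate(([-2.0] * k, np.linspace(-2, 2, 8), [2.0] * k))
coeffs = np.random.randn(len(knots) - k - 1)           # trained by gradient descent in a KAN
phi = BSpline(knots, coeffs, k)                        # phi: R -> R

x = np.linspace(-2, 2, 200)
y = phi(x)                                             # a smooth, adaptive "activation" curve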
Advantages of KANs
Enhanced Scalability: KANs handle high-dimensional data efficiently by breaking down complex functions into manageable components, making them ideal for large datasets.
Improved Accuracy: KANs achieve higher accuracy with fewer parameters by adaptively modeling relationships within data.
Interpretable Models: The structure of KANs offers transparency, enabling the extraction of symbolic formulas that represent learned patterns, unlike traditional black-box models.
Implementing KANs with PyKAN
Dataset Creation
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons
import torch
import numpy as np

# Two-class "moons" dataset: 1,000 training and 1,000 test points with a little noise
train_input, train_label = make_moons(n_samples=1000, noise=0.1)
test_input, test_label = make_moons(n_samples=1000, noise=0.1)

# pykan expects a dict of torch tensors with these four keys
dataset = {
    'train_input': torch.from_numpy(train_input),
    'test_input': torch.from_numpy(test_input),
    'train_label': torch.from_numpy(train_label),
    'test_label': torch.from_numpy(test_label)
}

# Quick sanity check: plot the training points coloured by class
X = dataset['train_input']
y = dataset['train_label']
plt.scatter(X[:, 0], X[:, 1], c=y)
plt.show()
Creating and Training a KAN
from kan import KAN

# Two inputs, two output logits; grid=3 intervals and k=3 (cubic) splines per edge
model = KAN(width=[2,2], grid=3, k=3)

def train_acc():
    return torch.mean((torch.argmax(model(dataset['train_input']), dim=1) == dataset['train_label']).float())

def test_acc():
    return torch.mean((torch.argmax(model(dataset['test_input']), dim=1) == dataset['test_label']).float())

# Train with LBFGS for 20 steps, tracking both accuracies
# (newer pykan releases rename this method to model.fit)
results = model.train(dataset, opt="LBFGS", steps=20, metrics=(train_acc, test_acc), loss_fn=torch.nn.CrossEntropyLoss())
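Because each edge of the trained model is just a one-dimensional spline, the whole network can be drawn and inspected, which is where the interpretability claim becomes tangible. A minimal sketch, assuming pykan's plotting API as shown in its tutorials:
# draw the learned spline on every edge of the [2,2] KAN (assumes KAN.plot from the pykan tutorials)
model.plot(beta=100)
plt.show()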
Obtaining Symbolic Formulas and Accuracy
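Before reading formulas off the network, each learned spline is typically snapped to the closest symbolic primitive from a candidate library; a minimal sketch, assuming pykan's auto_symbolic API and the candidate list used in its classification example:
# fix every spline to its best-matching symbolic function (assumed pykan API)
lib = ['x', 'x^2', 'x^3', 'x^4', 'exp', 'log', 'sqrt', 'tanh', 'sin', 'abs']
model.auto_symbolic(lib=lib)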
formula1, formula2 = model.symbolic_formula()[0]

def acc(formula1, formula2, X, y):
    correct = sum(
        (np.array(formula2.subs('x_1', X[i, 0]).subs('x_2', X[i, 1])).astype(np.float64) >
         np.array(formula1.subs('x_1', X[i, 0]).subs('x_2', X[i, 1])).astype(np.float64)) == y[i]
        for i in range(X.shape[0])
    )
    return correct / X.shape[0]

print('Train accuracy:', acc(formula1, formula2, dataset['train_input'], dataset['train_label']))
print('Test accuracy:', acc(formula1, formula2, dataset['test_input'], dataset['test_label']))
In conclusion, Kolmogorov–Arnold Networks (KANs) represent a transformative shift in neural network architecture, offering improved scalability, accuracy, and interpretability. With the new PyKAN library, implementing KANs is accessible, heralding a new era in machine learning and data science.
Top 5 ChatGPT prompts to help with your job search
Resume and Cover Letter Assistance:
"Can you help me write a resume for a [job title] position, highlighting my skills in [relevant skills] and experience in [relevant experience]?"
"I need help drafting a cover letter for a [job title] position at [company]. Can you help me make it compelling and tailored to the job description?"
Job Search Strategies:
"What are some effective strategies for searching for jobs in the [industry/field]?"
"How can I utilize LinkedIn effectively for my job search in [specific industry or role]?"
Interview Preparation:
"Can you provide a list of common interview questions for a [job title] position and tips on how to answer them?"
"How can I prepare for a technical interview for a [specific role, e.g., software engineer, data scientist]?"
Networking Tips:
"How can I network effectively to increase my chances of finding a job in [industry/field]?"
"What are some ways to approach and connect with professionals in [desired industry] on LinkedIn?"
Salary Negotiation:
"What are some tips for negotiating a job offer for a [job title] position?"
"How can I research and determine the appropriate salary range for a [specific role] in [location]?"
How did you like today's email?
If you are interested in contributing to the newsletter, respond to this email. We are looking for contributions from you, our readers, to keep the community alive and going.