Maclaurin Expansion of the Telegrapher’s Equation and Python Implementation

Kac (Kac, 1974) proved that his stochastic model can express the telegrapher's equation (1). In this report, I approximate a function F(x, t) using its Maclaurin expansion. The expansion of F(x, t) is shown in (2). We define the i-th partial derivative of F(x, t) with respect to t at t = 0 as φ_i (3), and the initial conditions are fixed as in (4, 5). Then (1) can be rewritten as the recurrence (6). Finally, I evaluated this expansion for t in the range 0 to 3. In particular, for F(0, 0.5) the output is 0.256581271960677.

http://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.rmjm/1250130879

[=figure1]
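For readability, the expansion and recurrence that the code below implements can be written out as follows (a reconstruction from the code, assuming the standard damped form of the telegrapher's equation, so the constants a and v here are taken from the code, not the figure):

```latex
% Telegrapher's equation (damped wave form), cf. (1):
\frac{\partial^2 F}{\partial t^2} + 2a\,\frac{\partial F}{\partial t}
    = v^2\,\frac{\partial^2 F}{\partial x^2}
% Maclaurin expansion in t, cf. (2), (3):
F(x, t) = \sum_{n=0}^{\infty} \frac{\varphi_n(x)}{n!}\, t^n,
\qquad
\varphi_n(x) = \left.\frac{\partial^n F}{\partial t^n}\right|_{t=0}
% Substituting the series into the PDE gives the recurrence used in the code, cf. (6):
\varphi_n = v^2\,\varphi_{n-2}'' - 2a\,\varphi_{n-1}, \qquad n \ge 2
```

with φ_0 and φ_1 supplied by the initial conditions (4, 5).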

#-*- coding:utf-8 -*-
import numpy as np
from sympy import Symbol, exp
import matplotlib.pyplot as plt

def kac_Maclaurin(t):
	# Truncated Maclaurin expansion of F(X, t) with n_max terms
	n_max = 52; X = 0.0; a = 2.0; v = 1.0
	x = Symbol('x')
	# phi[n] is the n-th t-derivative of F at t = 0
	phi = []
	phi.append(exp(-x**2) * (0.1 + (x - 0.25)**2))  # phi_0: F(x, 0)
	phi.append(0*x)                                  # phi_1: F_t(x, 0) = 0
	F = phi[0].subs(x, X) + phi[1].subs(x, X)*t
	p = 1  # running factorial n!
	for n in range(2, n_max):
		# recurrence: phi_n = v^2 * phi_{n-2}'' - 2a * phi_{n-1}
		phi.append(v**2 * phi[n-2].diff(x, 2) - 2*a*phi[n-1])
		p *= n
		F += phi[n].subs(x, X)/p * t**n
	return F

def main():
	max_iter = 301
	T = np.linspace(0, 3, max_iter)
	N = np.zeros((1, max_iter))
	for t in range(max_iter):
		N[0, t] = kac_Maclaurin(T[t])
	plt.xlabel('t')
	plt.ylabel('F(0, t)')
	plt.plot(T, N[0, :])
	plt.show()

	for i in range(max_iter):
		print(T[i], N[0, i])

if __name__ == "__main__":
	main()

Science of the appetite (Sakurai, 2012)

Chapter 1 Origin of appetite
Feature of weight: weight is controlled homeostatically.
A specific part of the brain, the hypothalamus, is related to basic drives such as appetite, sleep, and emotion.
Fear and excitement are relayed from the limbic system to the hypothalamus and affect the autonomic nervous system as well as the secretion of hormones.
The suprachiasmatic nucleus is in charge of the circadian rhythm.

Dual-center theory: the origin of uncovering the function of the hypothalamus
1942: a lesion of the ventromedial nucleus of the hypothalamus caused rats to eat without stopping and become obese (Hetherington et al., 1942).
1951: a lesion of the lateral nucleus of the hypothalamus in rats and cats strongly inhibited their food intake (Anand et al., 1952).
Opposite regulation between the ventromedial nucleus and the lateral nucleus of the hypothalamus: "the satiety center" and "the feeding center".
Dual-center theory:
"the satiety center" inhibits the appetite.
"the feeding center" facilitates the appetite; it is related to motivation and arousal.
Human cases:
the lateral hypothalamic syndrome: disorders of the lateral nucleus induce anorexia.
the ventromedial hypothalamic syndrome: disorders of the ventromedial nucleus induce bulimia (overeating).

Q1. Does the appetite function as a warning system for checking the brain's energy?
No, because the brain is quite vulnerable. Instead, body tissues send deficiency signals to the brain; the brain detects the deficiency of body energy and produces the sense of appetite.

How are these signals sent to the hypothalamus?
Lipostatic theory: the adipose tissues signal satiety factors (the "lipostat").
1950s: in the blood of ventromedial-hypothalamically lesioned rats, there should exist satiety factors (Hervey et al., 1959).
Parabiosis: incising the abdominal regions of two rats and stitching them together so that they share circulation.
Hervey's hypothesis:
the bulimia and obesity of ventromedial-hypothalamically lesioned rats are caused by a deficiency of factors in the blood.
The result of the experiment:
in parabiotic pairs of a ventromedial-hypothalamically lesioned rat and a control rat,
the former showed no change,
while the latter lost weight.

1953: Kennedy's theory: the homeostasis of weight results from the adipose tissues' signals.

Glucostatic theory:
1959: Mayer hypothesized that in the hypothalamus there exist neurons that can detect the concentration of glucose, and that an increase in glucose inhibits the feeding center while the satiety center is facilitated (glucostatic theory) (Mayer et al., 1959).
1969: Oomura discovered electrophysiologically that there are glucose-excited neurons in the satiety center, while glucose-inhibited neurons exist in the feeding center.

Fatty acids inhibit the satiety-center neurons while facilitating the feeding center.

triangle.py

#-*- coding:utf-8 -*-
#1-6 Easy warm-up, pp. 21-22

#input
#n = 5
#a = [2, 3, 4, 5, 10]
n = 4
a = [4, 5, 10, 20]

def solve():
	#answer: longest perimeter found so far (0 if no triangle exists)
	ans = 0

	#enumerate all triples without duplication
	for i in range(n):
		for j in range(i + 1, n):
			for k in range(j + 1, n):
				perimeter = a[i] + a[j] + a[k]  #circumference
				ma = max(a[i], a[j], a[k])      #longest bar
				rest = perimeter - ma           #sum of the other two bars
				#triangle inequality: the longest side must be
				#strictly shorter than the sum of the other two
				if ma < rest:
					ans = max(ans, perimeter)
	return ans

print(solve())
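The same enumeration can be expressed more compactly with `itertools.combinations`; this is a sketch of my own (the function name `max_perimeter` is not from the book), equivalent to the triple loop above:

```python
from itertools import combinations

def max_perimeter(bars):
    """Longest triangle perimeter formable from a list of bar lengths (0 if none)."""
    best = 0
    for triple in combinations(bars, 3):
        longest = max(triple)
        # triangle inequality: the longest side must be strictly
        # shorter than the sum of the other two
        if longest < sum(triple) - longest:
            best = max(best, sum(triple))
    return best

print(max_perimeter([2, 3, 4, 5, 10]))  # 12 (triangle 3, 4, 5)
print(max_perimeter([4, 5, 10, 20]))    # 0 (no valid triangle)
```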

Model-based influences on humans' choices and striatal prediction errors (Daw, 2011)

Abstract:
The mesostriatal dopamine system is prominently implicated in model-free reinforcement learning, with fMRI BOLD signals in ventral striatum notably covarying with model-free prediction errors. However, latent learning and devaluation studies show that behavior also shows hallmarks of model-based planning, and the interaction between model-based and model-free values, prediction errors, and preferences is underexplored. We designed a multistep decision task in which model-based and model-free influences on human choice behavior could be distinguished. By showing that choices reflected both influences we could then test the purity of the ventral striatal BOLD signal as a model-free report. Contrary to expectations, the signal reflected both model-free and model-based predictions in proportions matching those that best explained choice behavior. These results challenge the notion of a separate model-free learner and suggest a more integrated computational architecture for high-level human decision-making.
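The model-free prediction errors the abstract refers to are usually formalized as temporal-difference (TD) errors. A minimal TD(0) sketch of my own (not the paper's two-step task; `td_update` and the toy values are illustrative assumptions):

```python
import numpy as np

def td_update(V, s, r, s_next, alpha=0.1, gamma=0.9):
    """One model-free TD(0) update. The prediction error delta is the
    quantity reported to covary with ventral striatal BOLD signals."""
    delta = r + gamma * V[s_next] - V[s]  # reward prediction error
    V[s] = V[s] + alpha * delta           # move the value estimate toward the target
    return V, delta

# toy example: two states, reward 1 on the transition out of state 0
V = np.zeros(2)
V, delta = td_update(V, s=0, r=1.0, s_next=1)
```

Here the first update yields delta = 1.0 (the reward was entirely unpredicted) and nudges V[0] up by alpha * delta.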

Game theory and neural basis of social decision making (Lee, 2009)

Abstract:
Decision making in a social group has two distinguishing features. First, humans and other animals routinely alter their behavior in response to changes in their physical and social environment. As a result, the outcomes of decisions that depend on the behavior of multiple decision makers are difficult to predict and require highly adaptive decision-making strategies. Second, decision makers may have preferences regarding consequences to other individuals and therefore choose their actions to improve or reduce the well-being of others. Many neurobiological studies have exploited game theory to probe the neural basis of decision making and suggested that these features of social decision making might be reflected in the functions of brain areas involved in reward evaluation and reinforcement learning. Molecular genetic studies have also begun to identify genetic mechanisms for personal traits related to reinforcement learning and complex social decision making, further illuminating the biological basis of social behavior.

Project Euler 76 (Python, 0.008590s)

#-*- coding:utf-8 -*-
import numpy as np
from datetime import datetime

def Euler76(A, n):
	#A[i, j] = number of partitions of i into exactly j parts,
	#via the recurrence p(i, j) = p(i - j, j) + p(i - 1, j - 1)
	A[0, 0] = 1
	for i in range(n):
		for j in range(i + 1):
			if j == 1 or i == j:
				A[i, j] = 1
			elif j > 0:
				A[i, j] = A[i - j, j] + A[i - 1, j - 1]
	#sum over j = 1..99 equals p(100) - 1: the ways to write 100
	#as a sum of at least two positive integers
	return int(sum(A[100][1:100]))

def main():
	A = np.zeros((101, 101), dtype=np.int64)
	n = 101
	start = datetime.now()
	answer = Euler76(A, n)
	end = datetime.now()
	print(end - start, answer)

if __name__ == "__main__":
	main()
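The recurrence can be sanity-checked on a small case: 5 can be written as a sum of at least two positive integers in 6 ways (4+1, 3+2, 3+1+1, 2+2+1, 2+1+1+1, 1+1+1+1+1). A plain-Python sketch (the helper name `partitions_exact` is my own):

```python
def partitions_exact(n):
    # p[i][j] = number of partitions of i into exactly j parts,
    # via p(i, j) = p(i - 1, j - 1) + p(i - j, j)
    p = [[0] * (n + 1) for _ in range(n + 1)]
    p[0][0] = 1
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            p[i][j] = p[i - 1][j - 1] + p[i - j][j]
    return p

table = partitions_exact(5)
print(sum(table[5][j] for j in range(2, 6)))  # 6 ways to write 5 as a sum of >= 2 parts
```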

Euler 87 (Python, 1.019476s)

#-*- coding:utf-8 -*-
from datetime import datetime
import random

#primality test (Miller-Rabin, probabilistic with k rounds)
def is_prime3(q, k=50):
	q = abs(q)
	if q == 2: return True
	if q < 2 or q & 1 == 0: return False
	d = (q - 1) >> 1
	while d & 1 == 0:
		d >>= 1
	for i in range(k):
		a = random.randint(1, q - 1)
		t = d
		y = pow(a, t, q)
		while t != q - 1 and y != 1 and y != q - 1:
			y = pow(y, 2, q)
			t <<= 1
		if y != q - 1 and t & 1 == 0:
			return False
	return True

def Euler87():
	below = 50000000
	max_base = 7072  # just above sqrt(50000000), the largest possible square base
	numbers = set()
	prime = [i for i in range(2, max_base) if is_prime3(i)]
	prime2 = [i for i in prime if i ** 3 < below]
	prime3 = [i for i in prime if i ** 4 < below]
	for i in prime3:
		for j in prime2:
			for k in prime:
				S = k ** 2 + j ** 3 + i ** 4
				if S > below: break  # primes ascend, so larger k only overshoot
				numbers.add(S)
	return len(numbers)

def main():
	start = datetime.now()
	answer = Euler87()
	end = datetime.now()
	print(end - start, answer)

if __name__ == "__main__":
	main()
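As a sanity check, the Project Euler 87 problem statement notes there are exactly four such numbers below fifty (28, 33, 47, 49). A small trial-division sketch of my own (independent of the Miller-Rabin code above; `count_prime_power_triples` is an illustrative name) reproduces this:

```python
def count_prime_power_triples(limit):
    """Count numbers below `limit` expressible as p^2 + q^3 + r^4
    with p, q, r prime. Trial division is enough for small limits."""
    def is_prime(n):
        if n < 2:
            return False
        for d in range(2, int(n ** 0.5) + 1):
            if n % d == 0:
                return False
        return True

    primes = [p for p in range(2, int(limit ** 0.5) + 1) if is_prime(p)]
    found = set()
    for p in primes:
        for q in primes:
            for r in primes:
                s = p ** 2 + q ** 3 + r ** 4
                if s < limit:
                    found.add(s)  # duplicates collapse automatically
    return len(found)

print(count_prime_power_triples(50))  # 4 (namely 28, 33, 47, 49)
```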