logical-neural network of FP

Problem description

Forward propagation (FP) in a neural network
I have recently been studying machine learning and do not fully understand backpropagation (BP) and forward propagation (FP).
Could someone more experienced explain: for the cost function in FP, I cannot find the corresponding y.
I have been thinking about it from the viewpoint of logistic classification.
My mathematical expressions are as follows:
Theta = an (n+1) x 1 matrix
Input_layer = an m x (n+1) matrix
y = an m x 1 matrix
Let sigmoid (the predicted output) be 1/(1 + e^(-Input_layer*Theta))
So my approach to the cost function is:
error = (sigmoid - y)^2 for each element;
m = number of training examples;
cost function = sum(error) / (2m);
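To make this concrete, here is a minimal NumPy sketch of the cost I described above (the sizes m and n and the random data are hypothetical placeholders, not my real data):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

m, n = 5, 3                                    # hypothetical training-set size and feature count
Input_layer = np.hstack([np.ones((m, 1)),      # m x (n+1); first column is the bias term
                         np.random.rand(m, n)])
Theta = np.random.rand(n + 1, 1)               # (n+1) x 1 parameter vector
y = np.random.randint(0, 2, size=(m, 1))       # m x 1 vector of 0/1 labels

prediction = sigmoid(Input_layer @ Theta)      # m x 1 predicted outputs
error = (prediction - y) ** 2                  # element-wise squared error
cost = np.sum(error) / (2 * m)                 # sum(error) / (2m)
```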
But for FP I can only picture an input layer, a hidden layer, and an output layer,
so I cannot find a y that corresponds to them,
because the size of the input layer != the size of the hidden layer...
How do I work out the cost function of a neural network?
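Here is roughly how I picture the forward pass with a single hidden layer (Theta1, Theta2, and all layer sizes below are hypothetical); I suspect the cost should compare y against the output layer only, but I am not sure:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

m, n, n_hidden, k = 5, 3, 4, 1                 # hypothetical sizes: examples, inputs, hidden units, outputs
X = np.random.rand(m, n)                       # input layer, m x n
y = np.random.randint(0, 2, size=(m, k))       # m x k labels

Theta1 = np.random.rand(n + 1, n_hidden)       # weights: input layer  -> hidden layer
Theta2 = np.random.rand(n_hidden + 1, k)       # weights: hidden layer -> output layer

a1 = np.hstack([np.ones((m, 1)), X])           # add a bias column to the input layer
a2 = sigmoid(a1 @ Theta1)                      # hidden-layer activations, m x n_hidden
a2 = np.hstack([np.ones((m, 1)), a2])          # add a bias column to the hidden layer
a3 = sigmoid(a2 @ Theta2)                      # output-layer activations, m x k, same shape as y

cost = np.sum((a3 - y) ** 2) / (2 * m)         # squared-error cost against y, taken at the output layer
```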

Time: 2024-10-31 14:06:22
