Fileyab Store

Fileyab: search for all types of educational files

Adaptive maintenance policies for aging devices using a Markov decision process

Published in: IEEE Transactions on Power Systems
Volume: 28, Issue: 3, Year: 2013
Pages: 10
Type: journal article
Publisher: Institute of Electrical & Electronics Engineers (IEEE)
Authors: Saranga K. Abeygunawardane, Panida Jirutitijaroen, Huan Xu
DOI: 10.1109/TPWRS.2012.2237042
Full article information: http://dx.doi.org/10.1109/TPWRS.2012.2237042

Article title: Adaptive maintenance policies for aging devices using a Markov decision process

Abstract (translated from Persian): In competitive environments, most devices operate near or at their defined limits, so their maintenance scheduling may be affected by system conditions. In this paper, we present a Markov decision process (MDP) ...
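To illustrate the kind of model the abstract describes, here is a minimal sketch of a finite-horizon MDP for maintenance scheduling solved by backward induction. All states, actions, costs, and transition probabilities below are illustrative assumptions, not the paper's actual model or parameters.

```python
# Illustrative finite-horizon MDP: choose "maintain" vs. "do_nothing" for a
# device whose deterioration state worsens over time. Solved by backward
# induction (dynamic programming). Numbers are made up for demonstration.
import numpy as np

# States: deterioration levels 0 (new) .. 3 (failed)
n_states = 4
actions = ["do_nothing", "maintain"]

# Assumed transition matrices P[a][s, s']
P = {
    "do_nothing": np.array([
        [0.90, 0.10, 0.00, 0.00],
        [0.00, 0.85, 0.15, 0.00],
        [0.00, 0.00, 0.80, 0.20],
        [0.00, 0.00, 0.00, 1.00],   # failed state is absorbing
    ]),
    "maintain": np.array([
        [1.00, 0.00, 0.00, 0.00],
        [0.90, 0.10, 0.00, 0.00],   # maintenance usually restores the device
        [0.80, 0.20, 0.00, 0.00],
        [0.00, 0.00, 0.00, 1.00],
    ]),
}

# Assumed per-stage costs C[a][s]: cheap maintenance vs. expensive failure
C = {
    "do_nothing": np.array([0.0, 0.0, 0.0, 100.0]),
    "maintain":   np.array([5.0, 5.0, 10.0, 100.0]),
}

def backward_induction(horizon):
    """Return the cost-minimizing action per state for each stage."""
    V = np.zeros(n_states)                 # terminal value: zero cost
    policy = []
    for _ in range(horizon):
        Q = {a: C[a] + P[a] @ V for a in actions}
        policy.append([min(actions, key=lambda a: Q[a][s])
                       for s in range(n_states)])
        V = np.minimum(Q["do_nothing"], Q["maintain"])
    policy.reverse()                       # policy[t] = rule for stage t
    return policy, V

policy, V = backward_induction(horizon=10)
print(policy[0])   # recommended action per deterioration state at stage 0
```

With these example numbers the policy takes the control-limit form typical of such models: do nothing while the device is healthy, maintain once deterioration reaches a threshold state.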



