Search results for: probabilistic roadmap
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 419

149 Infilling Strategies for Surrogate Model Based Multi-disciplinary Analysis and Applications to Velocity Prediction Programs

Authors: Malo Pocheau-Lesteven, Olivier Le Maître

Abstract:

Engineering and optimisation of complex systems is often achieved through multi-disciplinary analysis, where each subsystem is modeled and interacts with other subsystems to model the complete system. The coherence of the outputs of the different subsystems is achieved through compatibility constraints, which enforce the coupling between the subsystems. Due to the complexity of some subsystems and the computational cost of evaluating their respective models, it is often necessary to build surrogate models of these subsystems to allow their repeated evaluation at a relatively low computational cost. In this paper, Gaussian processes are used, as their probabilistic nature can be leveraged to evaluate the likelihood of satisfying the compatibility constraints. This paper presents infilling strategies to build accurate surrogate models of the subsystems in the areas where they are likely to meet the compatibility constraint. It is shown that these infilling strategies can reduce the computational cost of building surrogate models for a given level of accuracy. An application of these methods to velocity prediction programs used in offshore racing naval architecture further demonstrates their applicability in a real engineering context. Some examples of the application of uncertainty quantification to the field of naval architecture are also presented.
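The constraint-likelihood infill criterion described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes a Gaussian process posterior (mean, standard deviation) of the compatibility residual at each candidate point, and scores each candidate by the probability that the residual lies within a tolerance band.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def constraint_likelihood(mean, std, tol):
    """Probability that the compatibility constraint c(x) = 0 holds within
    +/- tol, given the GP posterior N(mean, std^2) of c at point x."""
    if std <= 0.0:
        return 1.0 if abs(mean) <= tol else 0.0
    return norm_cdf((tol - mean) / std) - norm_cdf((-tol - mean) / std)

def select_infill(candidates, tol):
    """Pick the candidate most likely to satisfy the compatibility
    constraint; candidates = [(x, posterior_mean, posterior_std), ...]."""
    return max(candidates, key=lambda c: constraint_likelihood(c[1], c[2], tol))[0]

# Three candidate designs with hypothetical GP predictions of the
# compatibility residual: the second is closest to feasibility.
cands = [(0.1, 0.8, 0.2), (0.5, 0.05, 0.3), (0.9, -1.2, 0.4)]
print(select_infill(cands, tol=0.1))  # -> 0.5
```

In a full implementation the winning point would be evaluated with the expensive subsystem model and added to the GP training set, and the loop repeated.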

Keywords: infilling strategy, Gaussian process, multi-disciplinary analysis, velocity prediction program

Procedia PDF Downloads 138
148 Analysis of Bed Load Sediment Transport Mataram-Babarsari Irrigation Canal

Authors: Agatha Padma Laksitaningtyas, Sumiyati Gunawan

Abstract:

The Mataram Irrigation Canal, 31.2 km long, is the main irrigation canal in the Special Region of Yogyakarta Province, connecting the Progo River on the west side and the Opak River on the east side. It plays an important role as the main water distribution carrier for various purposes such as agriculture, fishery, and plantation, and should therefore be free from sediment material. Bed load is the basal sediment that drives the sedimentation process in the irrigation canal. Sedimentation is a continuous process that deposits sediment on the bed of the canal and changes the water surface elevation, affecting the availability of water for irrigation. Two methods are used to predict the amount of sediment transported in the canal: the Meyer-Peter and Muller method, which is an energy-based approach, and the Einstein method, which is a probabilistic approach. Flow velocity was measured with the float method and with current meters. The channel geometry was measured directly in the field, and the bed sediment of the channel was sampled at three different points. The results show that the Meyer-Peter and Muller formula gives 60.75799 kg/s, whereas Einstein's method gives 13.06461 kg/s. These results may serve as a reference for dredging the sediments in the channel so as not to disrupt the flow of water in the irrigation canal.
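The energy-based Meyer-Peter and Muller (1948) approach mentioned above can be sketched in its standard dimensionless form. The channel depth, slope, and grain size below are illustrative placeholders, not the measured Mataram canal data.

```python
import math

RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81  # water/sediment density (kg/m^3), gravity

def mpm_bedload(depth_m, slope, d50_m):
    """Bed load mass transport rate per unit channel width (kg/s per m),
    via the Meyer-Peter and Muller formula q_b* = 8 (tau* - 0.047)^1.5."""
    # Shields parameter: dimensionless bed shear stress on the grains.
    shields = (RHO_W * G * depth_m * slope) / ((RHO_S - RHO_W) * G * d50_m)
    excess = max(shields - 0.047, 0.0)      # 0.047 = critical Shields stress
    qb_star = 8.0 * excess ** 1.5           # dimensionless transport rate
    # Back to a dimensional volumetric rate, then to a mass rate.
    qb_vol = qb_star * math.sqrt((RHO_S / RHO_W - 1.0) * G * d50_m ** 3)
    return qb_vol * RHO_S

qb = mpm_bedload(depth_m=1.2, slope=0.0008, d50_m=0.002)
print(round(qb, 4))  # kg/s per metre of channel width
```

Multiplying by the channel width gives a total rate comparable to the kg/s figures quoted in the abstract; transport vanishes when the Shields stress is below the critical value.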

Keywords: bed load, sediment, irrigation, Mataram canal

Procedia PDF Downloads 202
147 Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning

Authors: Carlos Gordon, Patricio Encalada, Henry Lema, Diego Leon, Dennis Chicaiza

Abstract:

The following work presents a proposal for autonomous navigation of mobile robots, implemented on an omnidirectional Kuka Youbot robot. We integrated the Robot Operating System (ROS) with machine learning algorithms, using two ROS distributions: ROS Hydro and ROS Kinetic. ROS Hydro manages the nodes for odometry, kinematics, and path planning, with statistical and probabilistic global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra's algorithm. ROS Kinetic is responsible for detecting dynamic objects that may lie on the planned trajectory and obstruct the path of the Kuka Youbot. Detection is handled by an artificial vision module with a neural network trained on the Single Shot MultiBox Detector (SSD) architecture, where the main dynamic objects of interest are human beings and domestic animals, among others. When objects are detected, the system modifies the trajectory or waits for the dynamic obstacle to move. Finally, the obstacles are avoided along the planned trajectory, and the Kuka Youbot reaches its goal thanks to the machine learning algorithms.
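The Dijkstra-based global planning step can be illustrated on a small occupancy grid. This is a generic sketch of the algorithm, not the ROS navigation stack's implementation; the 0/1 grid below is an illustrative map, not the robot's actual environment.

```python
import heapq

def dijkstra(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (1 = obstacle),
    returned as a list of (row, col) cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    if goal not in dist:
        return None
    path, node = [goal], goal
    while node != start:      # walk predecessors back to the start
        node = prev[node]
        path.append(node)
    return path[::-1]

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(dijkstra(grid, (0, 0), (2, 0)))  # routes around the obstacle wall
```

In the actual system the grid comes from the costmap, and detected dynamic obstacles are marked as occupied cells before replanning.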

Keywords: autonomous navigation, machine learning, path planning, robot operating system, open source computer vision library

Procedia PDF Downloads 154
146 Identifying the True Extent of Glioblastoma Based on Preoperative FLAIR Images

Authors: B. Shukir, L. Szivos, D. Kis, P. Barzo

Abstract:

Glioblastoma is the most malignant brain tumor; in general, the survival rate varies between 14 and 18 months. Glioblastoma consists of a solid and an infiltrative part. The standard therapeutic management of glioblastoma is maximum safe resection followed by chemo-radiotherapy. It is hypothesized that the peritumoral hyperintense region in fluid attenuated inversion recovery (FLAIR) images includes both vasogenic edema and infiltrating tumor cells. In our study, we aimed to determine the sensitivity and specificity of preoperative hyperintense FLAIR images in order to examine how well they define the true extent of glioblastoma. Sixteen glioblastoma patients were included in this study. The hyperintense FLAIR region was delineated preoperatively as the tumor mask. The infiltrative part of the glioblastoma was considered to be the regions where the tumor recurred on follow-up MRI, and the recurrence on the CE-T1 images was marked as the recurrence mask. Following the AAL3 and JHU white matter labels atlases, the brain was divided into cortical and subcortical regions, respectively. To calculate specificity and sensitivity, the FLAIR and recurrence masks were overlapped, counting how many regions were affected by both. The average sensitivity and specificity were 83% and 85%, respectively; individually, the sensitivity varied between 31% and 100%, and the specificity between 58% and 100%. These results suggest that, although FLAIR is an effective radiologic imaging tool, its prognostic value remains controversial, and probabilistic tractography remains the more reliable available method for identifying the true extent of glioblastoma.
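The region-overlap sensitivity/specificity computation described above can be sketched with atlas regions treated as sets. The region names and masks below are hypothetical placeholders, not the study's AAL3/JHU data; recurrence regions are treated as ground truth and the FLAIR mask as the prediction.

```python
def sens_spec(all_regions, flair_mask, recurrence_mask):
    """Sensitivity and specificity of the FLAIR mask, with recurrence
    regions as the ground-truth tumor infiltration."""
    tp = len(flair_mask & recurrence_mask)              # flagged and recurred
    fn = len(recurrence_mask - flair_mask)              # recurred, missed
    fp = len(flair_mask - recurrence_mask)              # flagged, no recurrence
    tn = len(all_regions - flair_mask - recurrence_mask)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity, specificity

regions = {f"region_{i}" for i in range(10)}
flair = {"region_0", "region_1", "region_2", "region_3"}
recurrence = {"region_1", "region_2", "region_4"}
print(sens_spec(regions, flair, recurrence))  # (2/3, 5/7)
```

In the study this would be computed per patient over the full atlas parcellation and then averaged.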

Keywords: brain tumors, glioblastoma, MRI, FLAIR

Procedia PDF Downloads 28
145 Cost Analysis of Neglected Tropical Disease in Nigeria: Implication for Programme Control and Elimination

Authors: Lawong Damian Bernsah

Abstract:

Neglected Tropical Diseases (NTDs) are most predominant among poor and rural populations and are endemic in 149 countries. These diseases are highly prevalent and responsible for infecting 1.4 billion people worldwide. The 17 neglected tropical diseases recognized by WHO constitute the fourth largest health and economic burden of all communicable diseases. Five of these 17 diseases are considered in the cost analysis of this paper: lymphatic filariasis, onchocerciasis, trachoma, schistosomiasis, and soil-transmitted helminth infections. WHO has proposed a roadmap for eradication and elimination by 2020, and treatments have been donated by pharmaceutical manufacturers through the London Declaration. The paper estimates the cost of the NTD control and elimination programme for each NTD and in total for Nigeria. This is necessary as it forms the basis upon which programme budget and expenditure could be planned. Moreover, given the opportunity cost of the resources devoted to NTDs, estimating the cost provides a basis for comparison. The cost of the NTD control and elimination programme is estimated using the population at risk for each NTD and for the total. The population at risk is taken from the national master plan for 2015-2020, while the cost per person is taken from similar studies conducted in similar settings and ranges from US$0.1 to US$0.5 for Mass Administration of Medicine (MAM) and between US$1 and US$1.5 for each NTD. The combined cost for all the NTDs was estimated at US$634.88 million for the period 2015-2020, and US$1.9 billion for each NTD for the same period. For sensitivity analysis and robustness, the cost per person was varied, and all estimates remained high.
Given that health expenditure for Nigeria (% of GDP) averaged 3.5% over the period 1995-2014, it is clear that efforts have to be made to improve allocation to the health sector in general, which it is hoped will trickle down to NTD control and elimination. Thus, the government and donor partners need to step up budgetary allocations and also be aware of the costs of the NTD control and elimination programme, since these resources have alternative uses.
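The population-at-risk costing described above reduces to multiplying each disease's at-risk population by a unit cost and varying that unit cost for sensitivity analysis. The populations below are illustrative placeholders, not the figures from Nigeria's national master plan.

```python
POP_AT_RISK = {                     # people at risk, per disease (illustrative)
    "lymphatic_filariasis": 120e6,
    "onchocerciasis": 50e6,
    "trachoma": 30e6,
    "schistosomiasis": 70e6,
    "soil_transmitted_helminths": 100e6,
}

def programme_cost(cost_per_person):
    """Cost of covering every person at risk, per disease, at a unit cost."""
    return {d: pop * cost_per_person for d, pop in POP_AT_RISK.items()}

# Simple sensitivity analysis over the unit-cost range cited in the abstract.
for unit in (0.1, 0.5, 1.0, 1.5):
    total = sum(programme_cost(unit).values())
    print(f"US${unit:.1f}/person -> US${total / 1e6:,.0f} million")
```

Because the estimate is linear in the unit cost, the ranking of scenarios is preserved under any positive unit cost, which is why the estimates "remained high" across the sensitivity analysis.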

Keywords: Neglected Tropical Disease, Cost Analysis, Neglected Tropical Disease Programme Control and Elimination, Cost per Person

Procedia PDF Downloads 253
144 Effect of Floods on Water Quality: A Global Review and Analysis

Authors: Apoorva Bamal, Agnieszka Indiana Olbert

Abstract:

Floods are among the most devastating hydro-climatic events, impacting a wide range of stakeholders through environmental, social, and economic losses. With differing inundation durations and levels of impact, flood hazards vary in degree and strength. Among the domains impacted by floods, environmental degradation in the form of water quality deterioration is one of the most affected but least highlighted across the world. Degraded water quality is caused by numerous natural and anthropogenic factors, both point and non-point sources of pollution. It is therefore essential to understand the nature and source of water pollution due to flooding. Floods affect not only physico-chemical water quality parameters but also biological elements, with a vivid influence on the aquatic ecosystem. This deteriorated water quality affects many water-use categories, viz. agriculture, drinking water, aquatic habitat, and miscellaneous services that require adequate water quality. This study identifies, reviews, evaluates, and assesses multiple studies conducted across the world to determine the impact of floods on water quality. With a detailed statistical analysis of the most relevant studies, it provides a synopsis of the methods used to assess the impact of floods on water quality in different geographies and identifies the gaps to be bridged. According to the majority of studies, different flood magnitudes have varied impacts on water quality parameters, leading to values either above or below those recommended for various categories. There is also an evident shift of biological elements in the impacted waters, leading to changes in phenology and in the inhabitants of the affected water body.
The physical, chemical, and biological degradation of water quality by floods depends on their duration, extent, magnitude, and flow direction. This research therefore provides an overview of the multiple impacts of floods on water quality, along with a roadmap towards an efficient and uniform linkage between floods and the dynamics of the impacted water quality.
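The comparison against recommended values described above can be sketched as a guideline-exceedance check. The parameter names, guideline ranges, and pre-/post-flood readings below are illustrative assumptions, not values from the reviewed studies.

```python
GUIDELINES = {                       # acceptable range (min, max) per parameter
    "pH": (6.5, 8.5),
    "turbidity_NTU": (0.0, 5.0),
    "dissolved_oxygen_mg_L": (5.0, float("inf")),
}

def exceedances(sample):
    """Return the parameters of a water sample outside guideline ranges."""
    out = []
    for param, value in sample.items():
        lo, hi = GUIDELINES[param]
        if not lo <= value <= hi:
            out.append(param)
    return out

pre_flood = {"pH": 7.2, "turbidity_NTU": 3.0, "dissolved_oxygen_mg_L": 7.5}
post_flood = {"pH": 6.1, "turbidity_NTU": 48.0, "dissolved_oxygen_mg_L": 3.9}
print(exceedances(pre_flood))   # -> []
print(exceedances(post_flood))  # -> ['pH', 'turbidity_NTU', 'dissolved_oxygen_mg_L']
```

Applied to paired pre-/post-flood samples across studies, such a check yields the exceedance counts that a meta-analysis can then aggregate by flood magnitude and geography.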

Keywords: floods, statistical analysis, water pollution, water quality

Procedia PDF Downloads 65
143 Estimation and Comparison of Delay at Signalized Intersections Based on Existing Methods

Authors: Arpita Saha, Satish Chandra, Indrajit Ghosh

Abstract:

Delay represents the time lost by a traveler while crossing an intersection. The efficiency of traffic operation at signalized intersections is assessed in terms of the delay caused to an individual vehicle. The Highway Capacity Manual (HCM) method and Webster's method are the most widely used in India for delay estimation. However, traffic in India is highly heterogeneous, with extremely poor lane discipline. Therefore, a comparison was made to explore the best delay estimation technique for Indian conditions. In this study, seven signalized intersections from three different cities were chosen, and data were collected during both morning and evening peak hours. Only undersaturated cycles were considered. Delay was estimated from the field data: with the help of Simpson's 1/3 rule, the delay of undersaturated cycles was estimated by measuring the area under the curve of queue length versus cycle time. The field-observed delay was then compared with the delay estimated using the HCM, Webster, probabilistic, Taylor expansion, and regression methods. The drawbacks of the existing delay estimation methods for use in Indian heterogeneous traffic conditions were identified, and the best method was proposed. It was observed that direct estimation of delay from field-measured data is more accurate than the existing conventional and modified methods.
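The Simpson's 1/3 rule step described above can be sketched directly: total delay is the area under the queue-length versus time curve, in vehicle-seconds. The queue profile below is an illustrative example, not the study's field data.

```python
def simpson_13(y, h):
    """Integrate equally spaced samples y (odd count, i.e. an even number
    of intervals) with spacing h, using Simpson's 1/3 rule."""
    if len(y) % 2 == 0:
        raise ValueError("need an odd number of samples (even intervals)")
    s = y[0] + y[-1]
    s += 4 * sum(y[1:-1:2])   # odd-indexed interior samples
    s += 2 * sum(y[2:-1:2])   # even-indexed interior samples
    return s * h / 3.0

# Queue length (vehicles) sampled every 10 s across one 60 s signal cycle.
queue = [0, 4, 8, 10, 6, 2, 0]
total_delay = simpson_13(queue, h=10.0)   # vehicle-seconds of delay
print(total_delay)
```

Dividing the integrated area by the number of vehicles served in the cycle gives the average delay per vehicle that is compared against the HCM and Webster estimates.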

Keywords: delay estimation technique, field delay, heterogeneous traffic, signalised intersection

Procedia PDF Downloads 277
142 The Journey from Lean Manufacturing to Industry 4.0: The Rail Manufacturing Process in Mexico

Authors: Diana Flores Galindo, Richard Gil Herrera

Abstract:

Nowadays, Lean Manufacturing and Industry 4.0 are very important in every country. One of the main benefits is continued market presence. It has been identified that there is a need to change existing educational programs, as well as update the knowledge and skills of existing employees. It should be borne in mind that behind each technological improvement, there is a human being. Human talent cannot be neglected. The main objectives of this article are to review the link between Lean Manufacturing, the incorporation of Industry 4.0 and the steps to follow to implement it; analyze the current situation and study the implications and benefits of this new trend, with a particular focus on Mexico. Lean Manufacturing and Industry 4.0 implementation waves must always take care of the most important capital – intellectual capital. The methodology used in this article comprised the following steps: reviewing the reality of the fourth industrial revolution, reviewing employees’ skills on the journey to become world-class, and analyzing the situation in Mexico. Lean Manufacturing and Industry 4.0 were studied not as exclusive concepts, but as complementary ones. The methodological framework used is focused on motivating companies’ collaborators to guarantee common results, innovate, and remain in the market in the face of new requirements from company stakeholders. The key findings were that both trends emphasize the need to improve communication across the entire company and incorporate new technologies into everyday work, from the shop floor to administrative staff, to help improve processes. Taking care of people, activities and processes will bring a company success. In the specific case of Mexico, companies in all sectors need to be aware of and implement technological improvements according to their specific needs. Low-cost labor represents one of the most typical barriers. 
In conclusion, companies must build a roadmap according to their strategy and needs to achieve their short-, medium-, and long-term goals.

Keywords: lean management, lean manufacturing, industry 4.0, motivation, SWOT analysis, Hoshin Kanri

Procedia PDF Downloads 126
141 Policy Analysis and Program Evaluation: Need to Designate a Navigable Spatial Identity for Slum Dwellers in India to Maximize Accessibility and Policy Impact

Authors: Resham Badri

Abstract:

Cities today are unable to ensure the equitable distribution of their socio-economic and infrastructural benefits to the marginalized urban poor, and the emergence of a pressing pandemic like COVID-19 has amplified this impact. Lack of identity, vulnerability, and inaccessibility contribute to exclusion. Owing to systemic gaps in institutional processes, urban development policies fail to represent and cater to the urban poor. This paper aims to be a roadmap for the Indian Government to understand the significance of designating a navigable spatial identity to slum dwellers in the form of a digital address, which can form the fundamental basis of identification, enabling access not only to basic services but also to other utilities. Capitalizing on such a granular and technology-backed approach would allow the urban poor to be targeted and reached strategically and would aid effective urban governance. This paper adopts a three-pronged approach: (i) policy analysis, understanding gaps in existing urban policies of India, such as the Pradhan Mantri Awas Yojana, Swachh Bharat Mission, and Aadhaar card policy; (ii) program evaluation, analyzing a case study in which slum dwellers in Kolhapur city in India have been provided with navigable addresses using Google Plus Codes and have gained access to basic services, vaccinations, and other emergency deliveries in COVID-19 times; and (iii) policy recommendation. The designation of a navigable spatial identity has tremendous potential to form the foundation on which policies can base their data collection and service delivery processes, providing not only basic services but also other infrastructural and social welfare initiatives. Hence, a massive window of opportunity lies in addressing the unaddressed to elevate their living standards and respond to their basic needs.
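The Google Plus Codes used in the Kolhapur case study are Open Location Codes: a latitude/longitude pair encoded with a 20-symbol alphabet into a short, shareable address. The following is a simplified sketch of the encoding (full 10-digit codes only, ignoring short codes and padding), not Google's reference implementation, and the Kolhapur coordinates are approximate.

```python
ALPHABET = "23456789CFGHJMPQRVWX"  # the 20-symbol OLC digit set

def encode_plus_code(lat, lng):
    """Encode a WGS84 lat/lng as a full 10-digit Plus Code (simplified)."""
    lat = min(max(lat + 90.0, 0.0), 180.0 - 1e-9)   # shift to positive ranges
    lng = (lng + 180.0) % 360.0
    code, resolution = "", 20.0
    for _ in range(5):            # five digit pairs: 20, 1, 1/20, ... degrees
        code += ALPHABET[int(lat / resolution)]
        code += ALPHABET[int(lng / resolution)]
        lat %= resolution
        lng %= resolution
        resolution /= 20.0
        if len(code) == 8:
            code += "+"           # separator after the first eight digits
    return code

code = encode_plus_code(16.7050, 74.2433)   # roughly central Kolhapur
print(code)
```

Each successive digit pair narrows the cell by a factor of 20 in each direction, so a 10-digit code identifies a cell of roughly 14 m by 14 m, precise enough to serve as a door-level address for an unaddressed dwelling.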

Keywords: policy analysis, urban poor, navigable spatial identity, accessibility

Procedia PDF Downloads 67
140 Ethically Integrating Robots to Assist Elders and Patients with Dementia

Authors: Suresh Lokiah

Abstract:

The emerging trend of integrating robots into elderly care, particularly for assisting patients with dementia, holds the potential to greatly transform the sector. Assisted living facilities, which house a significant number of elderly individuals and dementia patients, constantly strive to engage their residents in stimulating activities. However, due to staffing shortages, they often rely on volunteers to introduce new activities. Despite the availability of social interaction, these residents, frequently overlooked in society, are in desperate need of additional support. Robots designed for elder care are categorized based on their design and functionality. These categories include companion robots, telepresence robots, health monitoring robots, and rehab robots. However, the integration of such robots raises significant ethical concerns, notably regarding privacy, autonomy, and the risk of dehumanization. Privacy issues arise as these robots may need to continually monitor patient activities. There is also a risk of patients becoming overly dependent on these robots, potentially undermining their autonomy. Furthermore, the replacement of human touch with robotic interaction may lead to the dehumanization of care. This paper delves into the ethical considerations of incorporating robotic assistance in eldercare. It proposes a series of guidelines and strategies to ensure the ethical deployment of these robots. These guidelines suggest involving patients in the design and development process of the robots and emphasize the critical need for human oversight to respect the dignity and rights of the elderly and dementia patients. The paper also recommends implementing robust privacy measures, including secure data transmission and data anonymization. In conclusion, this paper offers a thorough examination of the ethical implications of using robotic assistance in elder care. 
It provides a strategic roadmap to ensure this technology is utilized ethically, thereby maximizing its potential benefits and minimizing any potential harm.

Keywords: human-robot interaction, robots for eldercare, ethics, health, dementia

Procedia PDF Downloads 58
139 Experimental Investigation of Nucleate Pool Boiling Heat Transfer on Laser-Structured Copper Surfaces of Different Patterns

Authors: Luvindran Sugumaran, Mohd Nashrul Mohd Zubir, Kazi Md Salim Newaz, Tuan Zaharinie Tuan Zahari, Suazlan Mt Aznam, Aiman Mohd Halil

Abstract:

With reference to the Energy Roadmap 2050, the minimization of greenhouse gas emissions and the enhancement of energy efficiency are the two key factors that could facilitate a radical change in the world's energy infrastructure. However, the energy demands of electronic devices have skyrocketed with the advent of the digital age. Currently, the two-phase cooling technique based on phase-change pool boiling heat transfer has received a lot of attention because of its potential to fully utilize the latent heat of the fluid and produce a highly effective heat dissipation capacity while keeping the equipment's operating temperature within an acceptable range. There are numerous strategies available for the alteration of heating surfaces, but finding the best, simplest, and most dependable one remains a challenge. Lately, surface texturing via laser ablation has been used in a variety of investigations, demonstrating its significant potential for enhancing pool boiling heat transfer performance. In this research, the nucleate pool boiling heat transfer performance of laser-structured copper surfaces of different patterns was investigated, with a bare copper surface serving as a reference. It was observed that the heat transfer coefficient increased with the surface area ratio and with the peak-to-valley height ratio of the microstructure. The laser-machined grain structure produced extra nucleation sites, which ultimately improved the pool boiling performance. Owing to the increase in nucleation site density and surface area, enhanced nucleate boiling served as the primary heat transfer mechanism. The pool boiling performance of the laser-structured copper surfaces was superior to that of the bare copper surface in all aspects.
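The heat transfer coefficient compared in the study is the ratio of wall heat flux to wall superheat on the boiling curve. The flux/superheat pairs below are illustrative assumptions for a bare versus a laser-structured surface, not the paper's measured data.

```python
def htc(heat_flux_w_m2, wall_superheat_k):
    """Nucleate boiling heat transfer coefficient h = q'' / (T_wall - T_sat)."""
    return heat_flux_w_m2 / wall_superheat_k

# (heat flux in W/m^2, wall superheat in K): a structured surface reaches the
# same flux at a lower superheat, hence a higher h.
bare = [(50e3, 8.0), (100e3, 12.0), (200e3, 18.0)]
structured = [(50e3, 5.0), (100e3, 7.5), (200e3, 11.0)]
for (qb, dtb), (qs, dts) in zip(bare, structured):
    print(f"q''={qb / 1e3:.0f} kW/m^2: h_bare={htc(qb, dtb):,.0f}, "
          f"h_structured={htc(qs, dts):,.0f} W/m^2K")
```

Plotting h against heat flux for each surface pattern reproduces the kind of comparison summarized in the abstract's conclusion.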

Keywords: heat transfer coefficient, laser structuring, micro structured surface, pool boiling

Procedia PDF Downloads 58
136 Comparative Public Administration: A Case Study of ASEAN Member States

Authors: Nattapol Pourprasert

Abstract:

This qualitative research has two objectives: (1) to compare the private sector of government across the ASEAN Member States, and (2) to study trends in private enterprise administration in the ASEAN Member States. The results of the first part are: (1) Thailand focuses on the personnel resource administrative system; (2) Indonesia focuses on an official system governed by good administrative principles; (3) Malaysia focuses on technology development to serve people; (4) the Philippines focuses on operational system development; (5) Singapore focuses on public service development; (6) Brunei Darussalam focuses on equality in government services for the people; (7) Vietnam focuses on creating a government labor base and developing the testing and administration of operational tests; (8) Myanmar focuses on human resources development; (9) Laos focuses on the form of local administration; (10) Cambodia focuses on reform of personnel resource policy. The results of the second part are: (1) Thailand develops government personnel under a qualitative official structure; (2) Indonesia has a Bureaucracy Reform Roadmap and a Medium-Term National Development Plan; (3) Malaysia maintains a database for public services; (4) the Philippines follows up on the operation of units under government policy; (5) Singapore builds reliability and public participation in setting government policy according to people's demands; (6) Brunei Darussalam provides social welfare to the people; (7) Vietnam reformed its testing system and administration, including the effective construction of a government manpower base; (8) Myanmar develops high-ranking administrators to develop the country; (9) Laos distributes power to localities; and (10) Cambodia reformed its personnel resource policy.

Keywords: public administration development, ASEAN member states, private sector, government

Procedia PDF Downloads 236
135 A Methodological Approach to Digital Engineering Adoption and Implementation for Organizations

Authors: Sadia H. Syeda, Zain H. Malik

Abstract:

As systems become more complex and the interdependencies of processes and subsystems continue to grow and transform, the need for a comprehensive method of tracking and linking the lifecycle of a system in digital form becomes ever more critical. Digital Engineering (DE) provides an approach to managing an authoritative data source that links, tracks, and updates system data as it evolves throughout the system development lifecycle. DE enables developing, tracking, and sharing system data, models, and other related artifacts in a digital environment accessible to all necessary stakeholders. The DE environment provides an integrated electronic repository that enables traceability between design, engineering, and sustainment artifacts. The primary objective of DE activities is to develop a set of integrated, coherent, and consistent system models for the program. It is envisioned to provide a collaborative information-sharing environment for various stakeholders, including operational users, acquisition personnel, engineering personnel, and logistics and sustainment personnel. Examining the processes that DE can support in the systems engineering life cycle (SELC) is a primary step in the DE adoption and implementation journey. Through an analysis of the U.S. Department of Defense's (DoD) Office of the Secretary of Defense (OSD) Digital Engineering Strategy and its implementation, together with examples of DE implementation by industry and technical organizations, this paper describes current DE processes and best practices for implementing DE across an enterprise. This will help identify the capabilities, environment, and infrastructure needed to develop a potential roadmap for implementing DE practices consistent with an organization's business strategy. A capability maturity matrix is provided to assess an organization's DE maturity, emphasizing how all the SELC elements interlink to form a cohesive ecosystem.
If implemented, DE can increase efficiency and improve the quality and outcomes of systems engineering processes.
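The capability maturity matrix idea can be sketched as scoring each SELC element on a maturity scale and summarizing the profile. The element names, level labels, and scores below are illustrative assumptions, not the paper's actual matrix.

```python
LEVELS = {1: "initial", 2: "managed", 3: "defined", 4: "measured", 5: "optimized"}

def de_maturity(scores):
    """Return the weakest SELC element and the average maturity level."""
    weakest = min(scores, key=scores.get)
    average = sum(scores.values()) / len(scores)
    return weakest, average

selc_scores = {                     # hypothetical 1-5 self-assessment scores
    "requirements_models": 3,
    "design_traceability": 2,
    "authoritative_data_source": 4,
    "sustainment_linkage": 2,
}
weakest, avg = de_maturity(selc_scores)
print(f"weakest: {weakest}, average level: {avg:.2f} ({LEVELS[round(avg)]})")
```

The weakest element is a natural candidate for the next step on the organization's DE adoption roadmap, since the ecosystem is only as cohesive as its least mature link.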

Keywords: digital engineering, digital environment, digital maturity model, single source of truth, systems engineering life-cycle

Procedia PDF Downloads 73
134 A Monte Carlo Fuzzy Logistic Regression Framework against Imbalance and Separation

Authors: Georgios Charizanos, Haydar Demirhan, Duygu Icen

Abstract:

Two of the most impactful issues in classical logistic regression are class imbalance and complete separation. These can result in model predictions leaning heavily towards the majority class of the binary response variable, or in over-fitting issues. Fuzzy methodology offers key solutions for handling these problems. However, most studies propose transforming the binary responses into a continuous format limited to [0,1]; this is called the possibilistic approach within fuzzy logistic regression. Following this approach is closer to straightforward regression, since a logit-link function is not utilized and fuzzy probabilities are not generated. In contrast, we propose a method of fuzzifying binary response variables that allows the use of the logit-link function, and hence a probabilistic fuzzy logistic regression model with the Monte Carlo method. The fuzzy probabilities are then classified by selecting a fuzzy threshold. Different combinations of fuzzy and crisp inputs, outputs, and coefficients are explored, aiming to understand which of these perform better under different conditions of imbalance and separation. We conduct numerical experiments using both synthetic and real datasets to demonstrate the performance of the fuzzy logistic regression framework against seven crisp machine learning methods. The proposed framework shows better performance irrespective of the degree of imbalance and the presence of separation in the data, while the considered machine learning methods are significantly impacted.
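The overall idea can be sketched as follows; this is an illustrative simplification of the pipeline, not the authors' exact algorithm. Binary responses are fuzzified into [0,1] by averaging Monte Carlo perturbations, a logit-link model is fit to the fuzzy responses by gradient descent, and the resulting probabilities are classified with a threshold.

```python
import math
import random

random.seed(0)

def fuzzify(y, spread=0.1, draws=50):
    """Fuzzy response in [0,1]: mean of Monte Carlo perturbations of a
    crisp 0/1 label, clipped to the unit interval."""
    return sum(min(max(y + random.uniform(-spread, spread), 0.0), 1.0)
               for _ in range(draws)) / draws

def fit_logit(xs, ys, lr=0.1, epochs=2000):
    """One-feature logistic regression via SGD on the cross-entropy loss;
    the targets ys may be fuzzy values in [0,1], not just 0/1."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            w -= lr * (p - y) * x     # dL/dw = (p - y) * x
            b -= lr * (p - y)
    return w, b

xs = [-2.0, -1.5, -1.0, 1.0, 1.5, 2.0]
ys = [fuzzify(y) for y in [0, 0, 0, 1, 1, 1]]   # fuzzy responses in [0,1]
w, b = fit_logit(xs, ys)
threshold = 0.5                                  # the fuzzy classification threshold
preds = [1 if 1.0 / (1.0 + math.exp(-(w * x + b))) > threshold else 0 for x in xs]
print(preds)  # separable toy data -> [0, 0, 0, 1, 1, 1]
```

Because the fuzzy targets never reach exactly 0 or 1, the fitted coefficients stay bounded even on this completely separated toy dataset, which is precisely the failure mode of crisp maximum-likelihood logistic regression under separation.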

Keywords: fuzzy logistic regression, fuzzy, logistic, machine learning

Procedia PDF Downloads 46
133 Smart Web Services in the Web of Things

Authors: Sekkal Nawel

Abstract:

The Web of Things (WoT), the integration of smart technologies from the Internet or network into Web architectures and applications, is becoming more complex, larger, and more dynamic. The WoT is associated with various elements such as sensors, devices, networks, protocols, data, functionalities, and architectures to perform services for stakeholders. These services operate in the context of the interaction between stakeholders and the WoT elements. Such context is becoming a key information source whose data are heterogeneous and uncertain, thus leading to complex situations. In this paper, we take an interest in the development of intelligent Web services. The key ingredients of this notion of "intelligence" are context diversity, the need for a semantic representation to manage complex situations, and the capacity to reason with uncertain data. In this perspective, we introduce a multi-layered architecture based on a generic intelligent Web service model dealing with various contexts, which proactively predicts future situations and reactively responds to real-time situations in order to support decision-making. For semantic context data representation, we use PR-OWL, a probabilistic ontology based on Multi-Entity Bayesian Networks (MEBN). PR-OWL is flexible enough to represent complex, dynamic, and uncertain contexts, the key requirements for developing intelligent Web services. A case study using the proposed architecture for intelligent plant watering shows the role of proactive and reactive contextual reasoning in the WoT.

Keywords: smart web service, the web of things, context reasoning, proactive, reactive, multi-entity bayesian networks, PR-OWL

Procedia PDF Downloads 43
132 Advanced Combinatorial Method for Solving Complex Fault Trees

Authors: José de Jesús Rivero Oliva, Jesús Salomón Llanes, Manuel Perdomo Ojeda, Antonio Torres Valle

Abstract:

Combinatorial explosion is a problem common to both predominant methods for solving fault trees: the Minimal Cut Set (MCS) approach and the Binary Decision Diagram (BDD). High memory consumption impedes the complete solution of very complex fault trees; only approximate, non-conservative solutions are possible in these cases using truncation or other simplification techniques. This paper proposes a method (CSolv+) for solving complex fault trees without any possibility of combinatorial explosion. Each individual MCS is immediately discarded after its contributions to the basic events' importance measures and to the Top gate Upper Bound Probability (TUBP) have been accounted for. An estimation of the Top gate Exact Probability (TEP) is also provided. Therefore, running on a computer cluster, CSolv+ guarantees the complete solution of complex fault trees. It was successfully applied to 40 fault trees from the Aralia fault tree database, evaluating the top gate probability, the 1000 Significant MCSs (SMCS), and the Fussell-Vesely, RRW, and RAW importance measures for all basic events. The highly complex fault tree nus9601 was solved with truncation probabilities from 10⁻²¹ to 10⁻²⁷, applied only to limit the execution time. The solution corresponding to 10⁻²⁷ evaluated 3,530,592,796 MCSs in 3 hours and 15 minutes.
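
The discard-as-you-go accounting described above can be sketched as follows: each cut set's probability is folded into a running product for the TUBP and into per-event Fussell-Vesely numerators, so the cut set list is never held in memory. This is an illustrative sketch, not CSolv+ itself; the rare-event form of the Fussell-Vesely measure used here is an assumption.

```python
def process_mcs_stream(mcs_stream, basic_event_probs):
    # Consume each minimal cut set exactly once: fold its probability
    # into the running Top gate Upper Bound Probability (TUBP) and into
    # each member event's Fussell-Vesely numerator, then discard it.
    complement = 1.0                       # running product of (1 - P(MCS_i))
    fv_num = {e: 0.0 for e in basic_event_probs}
    for cut_set in mcs_stream:
        p = 1.0
        for event in cut_set:              # P(MCS) = product of member probs
            p *= basic_event_probs[event]
        complement *= (1.0 - p)
        for event in cut_set:
            fv_num[event] += p             # rare-event FV numerator
    tubp = 1.0 - complement
    fussell_vesely = {e: (fv_num[e] / tubp if tubp > 0 else 0.0)
                      for e in fv_num}
    return tubp, fussell_vesely
```

Because `mcs_stream` can be any generator, memory use stays constant regardless of how many cut sets the tree produces, which is the point of the streaming approach.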

Keywords: system reliability analysis, probabilistic risk assessment, fault tree analysis, basic events importance measures

Procedia PDF Downloads 14
131 Julia-Based Computational Tool for Composite System Reliability Assessment

Authors: Josif Figueroa, Kush Bubbar, Greg Young-Morris

Abstract:

The reliability evaluation of composite generation and bulk transmission systems is crucial for ensuring a reliable supply of electrical energy to significant system load points. However, evaluating adequacy indices using probabilistic methods like sequential Monte Carlo simulation can be computationally expensive. Despite this, it is necessary when time-varying and interdependent resources, such as renewables and energy storage systems, are involved. Recent advances in solving power network optimization problems and in parallel computing have improved runtime performance while maintaining solution accuracy. This work introduces CompositeSystems, an open-source composite system reliability evaluation tool developed in Julia™, to address the current deficiencies of commercial and non-commercial tools. It presents the tool's design, validation, and effectiveness, including an analysis of two different formulations of the Optimal Power Flow problem. The simulations demonstrate excellent agreement with existing published studies while improving replicability and reproducibility. Overall, the proposed tool can provide valuable insights into the performance of transmission systems, making it an important addition to the existing toolbox for power system planning.
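
The sequential Monte Carlo adequacy evaluation mentioned above can be illustrated in miniature: each generating unit follows a two-state (up/down) Markov chain, and the loss-of-load probability (LOLP) is the fraction of simulated hours with insufficient capacity. This is a toy Python sketch rather than the Julia tool itself; the constant load and per-hour transition probabilities are simplifying assumptions.

```python
import random

def sequential_mcs_lolp(units, load, hours=8760, years=50, seed=7):
    # units: list of (capacity, per-hour failure prob, per-hour repair prob).
    # Sequential Monte Carlo: walk each unit's two-state Markov chain
    # hour by hour; LOLP is the fraction of hours in which the total
    # available capacity falls short of the (constant) load.
    rng = random.Random(seed)
    short_hours = total_hours = 0
    for _ in range(years):
        up = [True] * len(units)
        for _ in range(hours):
            for i, (cap, p_fail, p_repair) in enumerate(units):
                if up[i]:
                    if rng.random() < p_fail:
                        up[i] = False
                elif rng.random() < p_repair:
                    up[i] = True
            capacity = sum(cap for (cap, _, _), u in zip(units, up) if u)
            total_hours += 1
            if capacity < load:
                short_hours += 1
    return short_hours / total_hours
```

Sequential (as opposed to state-sampling) simulation is what lets time-varying and interdependent resources be represented, since each hour's state depends on the previous one.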

Keywords: open-source software, composite system reliability, optimization methods, Monte Carlo methods, optimal power flow

Procedia PDF Downloads 50
130 Probabilistic Approach of Dealing with Uncertainties in Distributed Constraint Optimization Problems and Situation Awareness for Multi-agent Systems

Authors: Sagir M. Yusuf, Chris Baber

Abstract:

In this paper, we describe how Bayesian inferential reasoning contributes to obtaining well-satisfied predictions for Distributed Constraint Optimization Problems (DCOPs) with uncertainties. We also demonstrate how DCOPs can be merged with multi-agent knowledge understanding and prediction (i.e., Situation Awareness). The DCOP functions were merged with a Bayesian Belief Network (BBN) in the form of situation, awareness, and utility nodes. We describe how uncertainties can be represented in the BBN to make effective predictions using the expectation-maximization algorithm or the conjugate gradient descent algorithm. The idea of variable prediction using Bayesian inference may reduce the number of variables in agents' sampling domains and also allow the estimation of missing variables. Experimental results showed that the BBN performs more compelling predictions with samples containing uncertainties than with perfect samples. That is, Bayesian inference can help in handling the uncertainties and dynamism of DCOPs, which is a current issue in the DCOP community. We show how Bayesian inference can be formalized with Distributed Situation Awareness (DSA) using uncertain and missing agents' data. The whole framework was tested on a multi-UAV mission for forest fire searching. Future work focuses on augmenting the existing architecture to deal with dynamic DCOP algorithms and multi-agent information merging.
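
The situation-prediction step can be illustrated with a minimal discrete Bayesian update: evidence nodes update the belief over the hidden situation, and a missing variable is predicted by marginalizing over that belief. This is a hand-rolled sketch using exact enumeration, not the expectation-maximization or conjugate gradient fitting the abstract refers to, and the fire-search numbers are invented for illustration.

```python
def posterior(prior, likelihoods, evidence):
    # Bayesian update: P(S | e1..en) is proportional to
    # P(S) * product of P(ei | S) over the observed evidence.
    post = {}
    for state, p in prior.items():
        for var, value in evidence.items():
            p *= likelihoods[var][state][value]
        post[state] = p
    z = sum(post.values())
    return {s: p / z for s, p in post.items()}

def predict_missing(post, likelihoods, var):
    # Predict an unobserved variable by marginalizing over situations:
    # P(X = v | evidence) = sum over s of P(X = v | s) * P(s | evidence).
    values = next(iter(likelihoods[var].values())).keys()
    return {v: sum(likelihoods[var][s][v] * post[s] for s in post)
            for v in values}
```

In a forest-fire search, an agent observing only a smoke sensor could update its belief over "fire" and then predict the unread heat sensor, which is the kind of missing-variable estimation the abstract describes.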

Keywords: DCOP, multi-agent reasoning, Bayesian reasoning, swarm intelligence

Procedia PDF Downloads 98
129 Irish Film Tourism, Neocolonialism and Star Wars: Charting a Course Towards Ecologically and Culturally Considered Representation and Tourism on Skellig Michael

Authors: Rachel Gough

Abstract:

In 2014, Skellig Michael, an island off Ireland’s western seaboard and a UNESCO World Heritage Site, became a major setting in Disney’s Star Wars franchise. The subsequent influx of tourists to the site has proven to be a point of contention nationally. The increased visitor numbers have uplifted certain areas of the local economy on the mainland but have caused irreparable damage to historic monuments and to endangered bird populations that breed on the island. Recent research carried out by a state body suggests far-reaching and long-term negative impacts on the island’s culture and environment should the association with the Star Wars franchise persist. In spite of this, the film has been widely endorsed by the Irish government as providing a vital economic boost to historically marginalised rural areas through film tourism. This paper argues quite plainly that what is taking place on Skellig is neocolonialism. Skellig Michael’s unique resources, its aesthetic qualities, its ecosystem, and its cultural currency, have been sold by the state to a multinational corporation, which profits from their use. Meanwhile, locals are left to do their best to turn a market trend into sustainable business at the expense of culture, ecology, and community. This paper intends to be the first dedicated study of the psychogeographic and cultural impact of Skellig Michael’s deterioration as a result of film tourism. It will discuss the projected impact of this incident on Irish culture more broadly and, finally, will attempt to lay out a roadmap for a more collaborative filmmaking and touristic approach that allows local cultures and ecosystems to thrive without drastically inhibiting cultural production. This paper will ultimately find that the consequences of this representation call for tourism to be read as a split concept — namely, into what we might loosely call “eco-tourism” and more capital-based “profit-bottom-line tourism.”

Keywords: ecology, film tourism, neocolonialism, sustainability

Procedia PDF Downloads 177
128 Optimization of Air Pollution Control Model for Mining

Authors: Zunaira Asif, Zhi Chen

Abstract:

Sustainable air quality management is recognized as one of the most serious environmental concerns in mining regions. Mining operations emit various types of pollutants that have significant impacts on the environment. This study presents a stochastic control strategy, developing an air pollution control model to achieve a cost-effective solution. The optimization method is formulated to predict the cost of treatment using linear programming with an objective function and multiple constraints. The constraints mainly address two factors: metal production should not exceed the available resources, and air quality should meet the standard criteria for each pollutant. The applicability of this model is explored through a case study of an open-pit metal mine in Utah, USA. The method uses meteorological data as a dispersion transfer function to reflect practical local conditions, and the probabilistic analysis of uncertainties in the meteorological conditions is accomplished by Monte Carlo simulation. Reasonable results have been obtained for selecting the optimized treatment technology for PM2.5, PM10, NOx, and SO2. A comparison analysis shows that the baghouse is the least-cost option for particulate matter, compared to the electrostatic precipitator and wet scrubbers, whereas non-selective catalytic reduction and dry flue-gas desulfurization are suitable for NOx and SO2 reduction, respectively. Thus, this model can aid planners in reducing these pollutants at a marginal cost by suggesting pollution control devices, while accounting for dynamic meteorological conditions and mining activities.
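
The technology-selection outcome can be illustrated with a deliberately reduced version of the model: instead of the full linear program with meteorological transfer functions, the sketch below simply picks the least-cost device per pollutant whose controlled emission meets the standard. The device names, efficiencies, and costs are illustrative assumptions, not the paper's data.

```python
def cheapest_compliant_device(emission, standard, devices):
    # devices maps name -> (removal_efficiency, annualized_cost).
    # Keep only devices whose controlled emission meets the standard,
    # then return the least-cost one (None if nothing complies).
    feasible = [(cost, name) for name, (eff, cost) in devices.items()
                if emission * (1.0 - eff) <= standard]
    return min(feasible)[1] if feasible else None
```

In the full model, this per-pollutant choice becomes a set of decision variables in the linear program, coupled through the production and ambient-quality constraints.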

Keywords: air pollution, linear programming, mining, optimization, treatment technologies

Procedia PDF Downloads 184
127 Epigenetic Modifying Potential of Dietary Spices: Link to Cure Complex Diseases

Authors: Jeena Gupta

Abstract:

In today’s world of pharmaceutical products, one should not forget the healing properties of inexpensive food materials, especially spices. They are known to possess hidden pharmaceutical ingredients that impart anti-microbial, anti-oxidant, anti-inflammatory, and anti-carcinogenic qualities. Further, aberrant epigenetic regulatory mechanisms such as DNA methylation, histone modifications, or altered microRNA expression patterns, which regulate gene expression without changing the DNA sequence, contribute significantly to the development of various diseases. Changing lifestyles and diets exert their effect by influencing these epigenetic mechanisms, which are thus the target of dietary phytochemicals. Bioactive components of plants have been in use for ages, but their potential to reverse epigenetic alterations and prevent disease is yet to be explored. Spices, rich repositories of many bioactive constituents, owe their unique aroma and taste to these compounds. Some spices, like curcuma and garlic, have been well evaluated for their epigenetic regulatory potential, but for others it is largely unknown. We have evaluated the biological activity of the phytoactive components of fennel, cardamom, and fenugreek by in silico molecular modeling, in vitro, and in vivo studies. Ligand-based similarity studies were conducted to identify structurally similar compounds in order to understand their biological behaviour. The database searching was done using fenchone from fennel, sabinene from cardamom, and protodioscin from fenugreek as query molecules in different small-molecule databases. The results of the database searching showed that these compounds have potential binding with different targets found in the Protein Data Bank. Furthermore, in addition to being epigenetic modifiers, in vitro studies demonstrated the antimicrobial, antifungal, antioxidant, and cytotoxicity-protective effects of fenchone, sabinene, and protodioscin. To the best of our knowledge, such studies facilitate target fishing as well as charting a roadmap in the drug design and discovery process for the identification of novel therapeutics.

Keywords: epigenetics, spices, phytochemicals, fenchone

Procedia PDF Downloads 137
126 From Poverty to Progress: A Comparative Analysis of Mongolia with PEER Countries

Authors: Yude Wu

Abstract:

Mongolia, grappling with significant socio-economic challenges, faces pressing issues of inequality and poverty, as evidenced by a high Gini coefficient and the highest poverty rate among the top 20 largest Asian countries. Despite government efforts, Mongolia's poverty rate fell only slightly, from 29.6 percent in 2016 to 27.8 percent in 2020. PEER countries, such as South Africa, Botswana, Kazakhstan, and Peru, share characteristics with Mongolia, including reliance on the mining industry and classification as lower middle-income countries; their successful transitions to upper middle-income status between 1994 and the 2010s provide valuable insights. Drawing on secondary analyses of existing research and PEER country profiles, this study evaluates past policies, identifies gaps in current approaches, and proposes recommendations to combat poverty sustainably, aiming to illuminate the multidimensional nature of underdevelopment in Mongolia. Policies from these countries, such as the GEAR policy in South Africa and economic diversification in Botswana, offer insights for Mongolia's development. Drawing inspiration from PEER countries, Mongolia can implement policies such as economic diversification to reduce vulnerability and create stable job opportunities. An emphasis on infrastructure, human capital, and strategic partnerships for Foreign Direct Investment (FDI) aligns with successful strategies implemented by PEER countries, providing a roadmap for Mongolia's development objectives.

Keywords: inequality, PEER countries, comparative analysis, nomadic animal husbandry, sustainable growth

Procedia PDF Downloads 47
125 Cyber-Social Networks in Preventing Terrorism: Topological Scope

Authors: Alessandra Rossodivita, Alexei Tikhomirov, Andrey Trufanov, Nikolay Kinash, Olga Berestneva, Svetlana Nikitina, Fabio Casati, Alessandro Visconti, Tommaso Saporito

Abstract:

It is well known that world and national societies are exposed to diverse threats: anthropogenic, technological, and natural. Anthropogenic threats pose the greater risks and thus attract special interest from researchers across a wide spectrum of disciplines in efforts to lower the pertinent risks. Some researchers have shown, by means of multilayered complex network models, how media promote the prevention of disease spread. To go further, the scope suggested in this paper includes not only mass-media sources but also personified social bots (socbots) linked according to reflexive theory. The novel scope considers information spread over conscious and unconscious agents while counteracting both natural and man-made threats, i.e., infections and terrorist hazards. Contrary to numerous publications on misinformation disseminated by ‘bad’ bots within social networks, this study focuses on ‘good’ bots, which should be mobilized to counter the former. These social bots are deployed in a mixture with real social actors engaged in concerted actions of spreading, receiving, and analyzing information. All the contemporary complex network platforms (multiplexes, interdependent networks, combined stem networks, et al.) are employed to describe and test socbot activities within competing information sharing tools, namely mass-media hubs, social networks, messengers, and e-mail, at all phases of disasters. The scope and concomitant techniques present evidence that embedding such socbots into the information sharing process crucially changes the network topology of actor interactions. The change might improve or impair the robustness of the social network environment: it depends on who controls the socbots and how. It is demonstrated that the topological approach elucidates techno-social processes within the field and outlines the roadmap to a safer world.

Keywords: complex network platform, counterterrorism, information sharing topology, social bots

Procedia PDF Downloads 140
124 Industry 4.0 Platforms as 'Cluster' Ecosystems for Small and Medium Enterprises (SMEs)

Authors: Vivek Anand, Rainer Naegele

Abstract:

Industry 4.0 is a global mega-trend revolutionizing the world of advanced manufacturing, but it also raises challenges for SMEs. In response, many regional as well as digital Industry 4.0 platforms have been set up to boost the competencies of established enterprises as well as SMEs. The concept of 'Clusters' is a policy tool that aims to establish sustainable and self-supporting structures in the industries of a region by identifying competencies and supporting cluster actors with services that match their growth needs. This paper is motivated by the idea that clusters have the potential to enable firms, particularly SMEs, to accelerate the innovation process and the transition to digital technologies. In this research, the efficacy of Industry 4.0 platforms as cluster ecosystems is evaluated, especially for SMEs. Focusing on the Baden-Württemberg region in Germany, an action research method is employed to study how SMEs leverage other actors on Industry 4.0 platforms to further their Industry 4.0 journeys. The aim is to evaluate how such platforms stimulate innovation, cooperation, and competitiveness. Additionally, the barriers to these platforms fulfilling their promise to serve as capacity-building cluster ecosystems for SMEs in a region are identified. The findings will be helpful to academicians and policymakers alike, who can leverage a 'cluster policy' to enable Industry 4.0 ecosystems in their regions. Relevant management and policy implications also stem from the analysis, which will be of interest to the various players in a cluster ecosystem, such as SMEs and service providers, who benefit from the cooperation and competition. The paper will improve the understanding of how a dialogue orientation, a bottom-up approach, and the active integration of all involved cluster actors enhance the potential of Industry 4.0 platforms. A strong collaborative culture is a key driver of digital transformation and technology adoption across sectors, value chains, and supply chains, and will position Industry 4.0 platforms at the forefront of the industrial renaissance. Motivated by this argument and based on the results of the qualitative research, a roadmap is proposed to position Industry 4.0 platforms as effective cluster ecosystems to support Industry 4.0 adoption in a region.

Keywords: cluster policy, digital transformation, industry 4.0, innovation clusters, innovation policy, SMEs and startups

Procedia PDF Downloads 203
123 Computational Identification of Signalling Pathways in Protein Interaction Networks

Authors: Angela U. Makolo, Temitayo A. Olagunju

Abstract:

The knowledge of signaling pathways is central to understanding the biological mechanisms of organisms, since in eukaryotic organisms the number of signaling pathways determines the number of ways the organism can react to external stimuli. Signaling pathways are studied using protein interaction networks constructed from protein-protein interaction data obtained by high-throughput experimental procedures. However, these high-throughput methods are known to produce very high rates of false positive and false negative interactions. In order to construct a useful protein interaction network from this noisy data, computational methods are applied to validate the protein-protein interactions. In this study, a computational technique was designed to identify signaling pathways from a protein interaction network constructed using validated protein-protein interaction data. A weighted interaction graph of the Saccharomyces cerevisiae (baker's yeast) organism was constructed, using the proteins as nodes and the interactions between them as edges. The weights were obtained using a Bayesian probabilistic network to estimate the posterior probability of interaction between two proteins, given gene expression measurements as biological evidence. Only interactions above a threshold were accepted for the network model. A pathway was formalized as a simple path in the interaction network from a starting protein to an ending protein of interest. We were able to identify some pathway segments, one of which signals the start of the process of meiosis in S. cerevisiae.
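
The pathway-identification step can be sketched as a depth-bounded search over the weighted graph: edges whose posterior interaction probability falls below the acceptance threshold are pruned, and a candidate pathway is any simple path between the chosen start and end proteins. The toy graph, threshold, and depth bound below are illustrative assumptions, not the study's data.

```python
def find_pathways(graph, start, end, threshold, max_len=6):
    # graph maps protein -> {neighbor: posterior interaction probability}.
    # Enumerate simple paths from start to end, keeping only edges whose
    # posterior probability meets the acceptance threshold.
    paths = []

    def dfs(node, path):
        if node == end:
            paths.append(list(path))
            return
        if len(path) >= max_len:               # bound pathway length
            return
        for nbr, w in graph.get(node, {}).items():
            if w >= threshold and nbr not in path:   # prune weak edges, stay simple
                path.append(nbr)
                dfs(nbr, path)
                path.pop()

    dfs(start, [start])
    return paths
```

Bounding the path length keeps the enumeration tractable, since the number of simple paths in a dense interaction network grows combinatorially.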

Keywords: Bayesian networks, protein interaction networks, Saccharomyces cerevisiae, signalling pathways

Procedia PDF Downloads 517
122 Advanced Numerical and Analytical Methods for Assessing Concrete Sewers and Their Remaining Service Life

Authors: Amir Alani, Mojtaba Mahmoodian, Anna Romanova, Asaad Faramarzi

Abstract:

Pipelines are extensively used engineering structures that convey fluid from one place to another. Most of the time, pipelines are placed underground and are subjected to soil weight and traffic loads. Corrosion of the pipe material is the most common form of pipeline deterioration and should be considered in both the strength and serviceability analysis of pipes. This research focuses on concrete pipes in sewage systems (concrete sewers). It first investigates how to incorporate the effect of corrosion, as a time-dependent deterioration process, into the structural and failure analysis of this type of pipe. Three probabilistic time-dependent reliability analysis methods are then discussed and developed: the first passage probability theory, the gamma distributed degradation model, and the Monte Carlo simulation technique. Sensitivity indexes that can be used to identify the parameters with the greatest effect on pipe failure are also discussed. The reliability analysis methods developed in this paper serve as rational tools for decision makers with regard to the strengthening and rehabilitation of existing pipelines. The results can be used to obtain a cost-effective strategy for the management of the sewer system.
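
Of the three reliability methods named in the abstract, the gamma distributed degradation model lends itself to a compact Monte Carlo sketch: cumulative corrosion loss at time t is drawn from a gamma distribution whose shape parameter grows linearly with time, and failure is exceedance of a critical wall-thickness loss. The parameter values below are illustrative assumptions, not the paper's calibration.

```python
import random

def gamma_degradation_failure_prob(shape_rate, scale, critical, t,
                                   n=20000, seed=3):
    # Stationary gamma process: corrosion loss at time t follows
    # Gamma(shape_rate * t, scale), so mean loss grows linearly in t.
    # The pipe "fails" once the loss exceeds the critical reduction.
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.gammavariate(shape_rate * t, scale) > critical)
    return hits / n
```

Sweeping `t` gives the time-dependent failure probability curve from which a remaining service life (the time at which the probability crosses an acceptable limit) can be read off.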

Keywords: reliability analysis, service life prediction, Monte Carlo simulation method, first passage probability theory, gamma distributed degradation model

Procedia PDF Downloads 432
121 Calibration of Resistance Factors for Reliability-Based Design of Driven Piles Considering Unsaturated Soil Effects

Authors: Mohammad Amin Tutunchian, Pedram Roshani, Reza Rezvani, Julio Ángel Infante Sedano

Abstract:

The highly recommended approach to design, known as the load and resistance factor design (LRFD) method, employs the geotechnical resistance factor (GRF) for shaping pile foundation designs. Within the standard process for designing pile foundations, geotechnical engineers commonly adopt a design strategy rooted in saturated soil mechanics (SSM), often disregarding the impact of unsaturated soil behavior. This oversight omits the enhancement in shear strength exhibited by unsaturated soils, resulting in overly cautious designs. This research presents a methodology for fine-tuning the GRF used for axially loaded driven piles in Winnipeg, Canada. This is achieved through the application of a well-established probabilistic approach known as the first-order second-moment (FOSM) method, while also accounting for the influence of unsaturated soil behavior. The findings demonstrate that incorporating the influence of unsaturated conditions yields a higher projected bearing capacity and recommends higher GRF values relative to established codes. Additionally, a novel factor, referred to as phi, has been introduced to encompass the impact of saturation conditions in the calculation of pile bearing capacity, as guided by prevalent static analysis techniques.
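
In its simplest mean-value form, the FOSM calibration idea reduces to a reliability index over the margin R - S; the suction-induced strength gain under unsaturated conditions enters as a higher mean resistance, which raises the index and hence supports a higher resistance factor. This is a generic mean-value FOSM sketch with invented numbers, not the Winnipeg calibration.

```python
import math

def fosm_beta(mu_r, sigma_r, mu_s, sigma_s):
    # Mean-value FOSM reliability index for the margin g = R - S,
    # with resistance R and load S independent:
    # beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2)
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

def failure_probability(beta):
    # Normal-margin approximation: P(g < 0) = Phi(-beta).
    return 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))
```

Comparing the index computed with a saturated-strength mean resistance against one that includes the unsaturated strength gain shows directly how the same target reliability can be met with a less conservative factor.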

Keywords: unsaturated soils, shear strength, LRFD, FOSM, GRF

Procedia PDF Downloads 71
120 Influences of Slope Inclination on the Storage Capacity and Stability of Municipal Solid Waste Landfills

Authors: Feten Chihi, Gabriella Varga

Abstract:

Landfills are the world's most prevalent waste management strategy; however, siting them has grown more difficult due to a lack of acceptable waste sites. In order to develop larger landfills and extend their lifespan, this article expands the capacity of the construction by varying the slope inclination and examines its effect on the safety factor. The capacity change with tilt is determined mathematically. Using a new probabilistic calculation method that takes into account the heterogeneity of the waste layers, the safety factor for various slope angles is examined. To assess the effect of slope variation on the overall safety of landfills, over a hundred computations were performed for each angle. It is shown that capacity increases significantly with increasing inclination: passing from 1:3 to 2:3 and from 1:3 to 1:2 slope angles, the volume of waste that can be deposited increases by 40 percent and 25 percent of the initial volume, respectively. The safety factor results indicate that slopes of 1:3 and 1:2 are safe when the standard method (homogeneous waste) is used for computation. Using the new approach, a slope with an inclination of 2:3 can also be deemed safe, even though the calculation does not account for the safety-enhancing effect of daily cover layers. Based on the study reported in this paper, the multi-layered nonhomogeneous calculation technique better characterizes the safety factor. As it more closely resembles the actual state of landfills, the employed technique allows for more flexibility in the design parameters. This work represents a substantial advance toward landfills that are both safe and economical.
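
The capacity gain from steepening the side slopes can be illustrated on an idealized prismatic landfill with a trapezoidal cross-section over a fixed footprint. The exact percentages in the abstract depend on the real site geometry, so the dimensions below are illustrative assumptions only.

```python
def cross_section_area(footprint_width, height, rise, run):
    # Trapezoidal landfill cross-section on a fixed footprint: a side
    # slope of rise:run (vertical:horizontal, e.g. 1:3) sets the crest
    # back by height * run / rise on each side.
    setback = height * run / rise
    top_width = footprint_width - 2.0 * setback
    if top_width < 0:
        raise ValueError("slope too flat for this footprint and height")
    return 0.5 * (footprint_width + top_width) * height
```

For a 100 m footprint and 10 m height, steepening from 1:3 to 2:3 raises the cross-sectional area from 700 to 850 square metres in this toy geometry, about a 21 percent gain; the paper's larger figures reflect its actual landfill geometry.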

Keywords: landfill, municipal solid waste, slope inclination, capacity, safety factor

Procedia PDF Downloads 172