Search results for: probability bivariant models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7613

7373 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses

Authors: Neil Bar, Andrew Heweston

Abstract:

Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation and open pit mine designs. However, the design acceptance criteria generally provide no guidance on how PF should be calculated for homogeneous and heterogeneous rock masses, or on what qualifies as a ‘reasonable’ PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software packages offer statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to automatically calculate PF. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, shear strength of geologic structure and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated ‘approximately’ or with allowances for some variability rather than ‘exactly’. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user’s discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permit. A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF can be calculated using different methods and how markedly different results can be obtained. Ultimately, sound engineering judgement and logic are often required to decipher the true meaning and significance (if any) of some PF results.
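To make the sampling idea concrete, the following minimal sketch (not the software used in the study) estimates PF by Monte-Carlo simulation of a simple dry planar sliding block; the distributions, geometry and strength values are hypothetical illustrations.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte-Carlo samples

# Hypothetical probabilistic inputs for a simple planar-failure check
cohesion = rng.normal(35.0, 7.0, N)          # kPa
phi = np.radians(rng.normal(32.0, 3.0, N))   # friction angle
unit_weight = 26.0                           # kN/m3, treated deterministically
slope_dip = np.radians(50.0)                 # slope face dip
plane_dip = np.radians(35.0)                 # sliding plane dip
height = 60.0                                # slope height, m

# Driving and resisting forces for a dry planar sliding block (per metre width)
area = height / np.sin(plane_dip)
weight = 0.5 * unit_weight * height**2 * (
    1.0 / np.tan(plane_dip) - 1.0 / np.tan(slope_dip))
fs = (cohesion * area + weight * np.cos(plane_dip) * np.tan(phi)) / (
    weight * np.sin(plane_dip))

pf = np.mean(fs < 1.0)          # probability of failure = P(FS < 1)
print(f"mean FS = {fs.mean():.2f}, PF = {pf:.3%}")
```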

Keywords: probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability

Procedia PDF Downloads 204
7372 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons justify the use of decision theory in medicine: 1. The growth of medical knowledge and its complexity make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying opportunities for treatment from large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. These differences are generally attributed to differences in the estimated probabilities of success of the treatments involved, and to differing assessments of the value of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making. For this, the decision process should be explained and broken down. A decision problem is to select the best option among a set of choices. The problem is what is meant by "best option", or knowing which criteria guide the choice. The purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand the differences in medical practices and facilitates the search for consensus. In this regard, there are three types of situations: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequences of each decision are certain. 2. In risky situations, every decision can have several consequences, and the probability of each of these consequences is known. 3. In uncertain situations, each decision can have several consequences, and their probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians. Decision theory can make decisions more transparent: first, by systematically clarifying the data of the problem, and secondly by stating a few basic principles that should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus assist the patient and doctor in their choices.
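As an illustration of decision making under risk, the following minimal sketch compares treatment options by expected utility; the options, probabilities and utility values are hypothetical, not clinical data.

```python
# Decision under risk: choose the option with the highest expected utility,
# given known outcome probabilities (all numbers are hypothetical).

treatments = {
    "surgery":          [(0.80, 0.95), (0.15, 0.40), (0.05, 0.00)],  # (probability, utility)
    "medication":       [(0.60, 0.85), (0.35, 0.60), (0.05, 0.30)],
    "watchful_waiting": [(1.00, 0.55)],
}

def expected_utility(outcomes):
    """Sum of probability-weighted utilities for one option."""
    return sum(p * u for p, u in outcomes)

for name, outcomes in treatments.items():
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9  # probabilities sum to 1
    print(f"{name:16s} expected utility = {expected_utility(outcomes):.3f}")

best = max(treatments, key=lambda t: expected_utility(treatments[t]))
print("recommended option under the expected-utility criterion:", best)
```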

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 573
7371 Optimization of Flexible Job Shop Scheduling Problem with Sequence-Dependent Setup Times Using Genetic Algorithm Approach

Authors: Sanjay Kumar Parjapati, Ajai Jain

Abstract:

This paper presents makespan optimization for an ‘n’-job, ‘m’-machine flexible job shop scheduling problem with sequence-dependent setup times using a genetic algorithm (GA) approach. A restart scheme has also been applied to prevent premature convergence. Two case studies are taken into consideration. Results are obtained by considering a crossover probability of pc = 0.85 and a mutation probability of pm = 0.15. Five simulation runs are performed for each case study, and the minimum value among them is taken as the optimal makespan. Results indicate that the optimal makespan can be achieved with more than one sequence of jobs in a production order.
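The following minimal sketch illustrates the GA ingredients mentioned above (order crossover with pc = 0.85, swap mutation with pm = 0.15, and a restart scheme) on a simplified single-machine sequence-dependent setup time instance; the instance data and all GA settings other than pc and pm are hypothetical, and this is not the authors' implementation.

```python
import random

random.seed(1)

# Toy instance: one machine, sequence-dependent setup times (hypothetical data).
proc = [5, 3, 8, 6, 4, 7]                         # processing time of each job
n = len(proc)
setup = [[0 if i == j else (i * 7 + j * 3) % 5 + 1 for j in range(n)]
         for i in range(n)]                       # setup[i][j]: setup when j follows i

def makespan(seq):
    t, prev = 0, None
    for j in seq:
        t += (setup[prev][j] if prev is not None else 0) + proc[j]
        prev = j
    return t

def order_crossover(a, b):
    """Order crossover (OX) for job permutations."""
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def swap_mutation(seq):
    i, j = random.sample(range(n), 2)
    seq[i], seq[j] = seq[j], seq[i]

PC, PM, POP, GENS = 0.85, 0.15, 30, 200           # pc and pm as in the abstract
pop = [random.sample(range(n), n) for _ in range(POP)]
best, stall = min(pop, key=makespan), 0

for _ in range(GENS):
    parents = [min(random.sample(pop, 3), key=makespan) for _ in range(POP)]  # tournament
    children = []
    for a, b in zip(parents[::2], parents[1::2]):
        c1 = order_crossover(a, b) if random.random() < PC else a[:]
        c2 = order_crossover(b, a) if random.random() < PC else b[:]
        for c in (c1, c2):
            if random.random() < PM:
                swap_mutation(c)
            children.append(c)
    pop = children
    cand = min(pop, key=makespan)
    if makespan(cand) < makespan(best):
        best, stall = cand[:], 0
    else:
        stall += 1
    if stall > 30:                                # restart scheme against premature convergence
        pop = [random.sample(range(n), n) for _ in range(POP)]
        stall = 0

print("best sequence:", best, "makespan:", makespan(best))
```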

Keywords: flexible job shop, genetic algorithm, makespan, sequence dependent setup times

Procedia PDF Downloads 323
7370 Enhancing Signal Reception in a Mobile Radio Network Using Adaptive Beamforming Antenna Arrays Technology

Authors: Ugwu O. C., Mamah R. O., Awudu W. S.

Abstract:

This work is aimed at enhancing signal reception in a mobile radio network and minimizing outage probability using adaptive beamforming antenna arrays. An empirical real-time drive measurement was carried out in a cellular network of Globalcom Nigeria Limited located at Ikeja, the capital of Lagos State, Nigeria, with reference base station number KJA 004. The measurements included Received Signal Strength and Bit Error Rate, which were recorded to characterize the signal strength of the network at the time of the study. The Received Signal Strength and Bit Error Rate were measured with a spectrum monitoring van with the help of a ray tracer at intervals of 100 meters up to 700 meters from the transmitting base station. The distance and angular location measurements from the reference network were made with the help of a Global Positioning System (GPS). The other equipment used included transmitting equipment measurement software (Temsoftware), laptops, and log files, which showed received signal strength against distance from the base station. Results of about 11% were obtained from the real-time experiment, showing that mobile radio networks are prone to signal failure; this can be minimized using an adaptive beamforming antenna array through a significant reduction in Bit Error Rate, which implies improved performance of the mobile radio network. In addition to the empirical measurements, enhanced mathematical models were developed and implemented as reference models for accurate prediction. The proposed signal models were based on the analysis of continuous time and discrete space, along with some other assumptions. These enhanced models were validated using MATLAB (version 7.6.3.35) and compared with a conventional antenna for accuracy. The outage models were used to manage the blocked-call experience in the mobile radio network. A 20% improvement was obtained when the adaptive beamforming antenna arrays were implemented on the wireless mobile radio network.
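As a minimal illustration of adaptive beamforming (not the models developed in this work), the following sketch adapts the weights of a uniform linear array with the LMS algorithm so that a desired signal direction is favoured over an interferer; the array geometry, arrival angles and signal parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

M, d = 8, 0.5                          # array elements and spacing in wavelengths
theta_sig, theta_int = 0.0, 40.0       # signal and interferer arrival angles (degrees)

def steering(theta_deg):
    """Steering vector of a uniform linear array."""
    k = 2 * np.pi * d * np.sin(np.radians(theta_deg))
    return np.exp(1j * k * np.arange(M))

N = 2000
s = np.exp(1j * 2 * np.pi * 0.01 * np.arange(N))                       # desired reference signal
interf = np.sqrt(10 / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
x = np.outer(steering(theta_sig), s) + np.outer(steering(theta_int), interf) + noise

# LMS adaptation of the array weights towards the known reference signal
w = np.zeros(M, dtype=complex)
mu = 1e-3
for t in range(N):
    y = np.vdot(w, x[:, t])            # array output  w^H x
    e = s[t] - y                       # error against the reference
    w += mu * np.conj(e) * x[:, t]     # LMS weight update

def gain(theta_deg):
    return abs(np.vdot(w, steering(theta_deg)))

print(f"gain towards signal ({theta_sig} deg): {gain(theta_sig):.2f}")
print(f"gain towards interferer ({theta_int} deg): {gain(theta_int):.2f}")
```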

Keywords: beamforming algorithm, adaptive beamforming, simulink, reception

Procedia PDF Downloads 29
7369 Unsupervised Reciter Recognition Using Gaussian Mixture Models

Authors: Ahmad Alwosheel, Ahmed Alqaraawi

Abstract:

This work proposes an unsupervised, text-independent probabilistic approach to recognize Quran reciters by voice. It is an accurate approach that works in real-time applications and does not require prior information about reciter models. It has two phases: in the training phase, the reciters' acoustical features are modeled using Gaussian Mixture Models, while in the testing phase, an unlabeled reciter's acoustical features are scored against the GMM models. Using this approach, high accuracy is achieved with efficient computation time.
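A minimal sketch of the two-phase approach, assuming scikit-learn is available and using random vectors as stand-ins for the acoustical features (in practice, features such as MFCCs would be extracted from the recordings):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-in "acoustic features" for two reciters; in a real system these would be
# MFCC-like vectors extracted from audio with a signal-processing front-end.
train = {
    "reciter_A": rng.normal(loc=0.0, scale=1.0, size=(500, 13)),
    "reciter_B": rng.normal(loc=1.5, scale=1.2, size=(500, 13)),
}

# Training phase: one GMM per reciter over their feature vectors
models = {name: GaussianMixture(n_components=8, covariance_type="diag",
                                random_state=0).fit(feats)
          for name, feats in train.items()}

# Testing phase: score an unlabeled utterance against every model and pick the
# reciter whose GMM gives the highest average log-likelihood.
test_utterance = rng.normal(loc=1.5, scale=1.2, size=(200, 13))
scores = {name: gmm.score(test_utterance) for name, gmm in models.items()}
print(scores)
print("recognized as:", max(scores, key=scores.get))
```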

Keywords: Quran, speaker recognition, reciter recognition, Gaussian Mixture Model

Procedia PDF Downloads 375
7368 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow down the disease progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbor, support vector machine, random forest, and neural network, were developed. Data were divided into training and testing sets, where training sets were used to build the predictive models, and testing sets were used to assess the accuracy of prediction. Key risk factors were identified, and the models were compared to identify the best prediction model. Among these models, the random forest model appeared to be the best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important contributing factors to the detection of Alzheimer’s. Across all the models used, the percentage of testing inputs for which at least 4 of the 5 models agreed on the same diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer’s with good accuracy, which ultimately leads to earlier treatment of these patients.
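A minimal sketch of the random forest workflow described above, using synthetic stand-in data rather than the actual MRI/demographic dataset; the labelling rule and parameter values are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the MRI/demographic table (373 sessions in the paper);
# columns mimic MMSE, nWBV and gender, the three factors highlighted above.
n = 373
mmse = rng.normal(27, 3, n)
nwbv = rng.normal(0.73, 0.04, n)
gender = rng.integers(0, 2, n)
X = np.column_stack([mmse, nwbv, gender])
y = ((mmse < 26) & (nwbv < 0.72)).astype(int)   # hypothetical "demented" label

# Train/test split, model fitting and evaluation
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
print("feature importances (MMSE, nWBV, gender):", clf.feature_importances_)
```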

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 138
7367 Fuzzy Set Qualitative Comparative Analysis in Business Models' Study

Authors: K. Debkowska

Abstract:

The aim of this article is to present the possibilities of using Fuzzy Set Qualitative Comparative Analysis (fsQCA) in research concerning the business models of enterprises. FsQCA is a bridge between quantitative and qualitative research, and its potential can be used in the analysis and evaluation of business models. The article presents the results of a study conducted on enterprises belonging to different sectors: transport and logistics, industry, building construction, and trade. The enterprises were researched taking into account the components of their business models and the financial condition of the companies. Business models are complex and heterogeneous in nature. The use of fsQCA made it possible to answer the following question: which components of a business model, and in which configuration, lead to a better financial condition of enterprises. The analysis was performed separately for particular sectors, which made it possible to compare the combinations of business model components that actively influence the financial condition of enterprises in the analyzed sectors. The following components of business models were analyzed for the purposes of the study: Key Partners, Key Activities, Key Resources, Value Proposition, Channels, Cost Structure, Revenue Streams, Customer Segment and Customer Relationships. These components constituted the variables shaping the financial results of enterprises. The results of the study lead us to believe that fsQCA can help in analyzing and evaluating a business model, which is important when making a business decision about the business model used or about changing it. In addition, results obtained by fsQCA can be applied by all stakeholders connected with the company.
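A minimal sketch of two fsQCA building blocks, direct calibration into fuzzy-set membership scores and the consistency/coverage of a configuration; the indicators, anchors and configuration are hypothetical, and this is not the software used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def calibrate(raw, full_out, crossover, full_in):
    """Direct calibration of raw values into fuzzy-set membership scores (0..1),
    mapping the anchors to log-odds of roughly -3, 0 and +3."""
    log_odds = np.where(raw >= crossover,
                        3.0 * (raw - crossover) / (full_in - crossover),
                        3.0 * (raw - crossover) / (crossover - full_out))
    return 1.0 / (1.0 + np.exp(-log_odds))

# Hypothetical raw indicators for 30 enterprises (scale 0-10)
key_resources  = calibrate(rng.uniform(0, 10, 30), 2, 5, 8)
value_prop     = calibrate(rng.uniform(0, 10, 30), 2, 5, 8)
good_condition = calibrate(rng.uniform(0, 10, 30), 2, 5, 8)  # outcome: good financial condition

# Configuration "strong Key Resources AND strong Value Proposition" (fuzzy AND = min)
config = np.minimum(key_resources, value_prop)

consistency = np.sum(np.minimum(config, good_condition)) / np.sum(config)
coverage    = np.sum(np.minimum(config, good_condition)) / np.sum(good_condition)
print(f"consistency = {consistency:.2f}, coverage = {coverage:.2f}")
```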

Keywords: business models, components of business models, data analysis, fsQCA

Procedia PDF Downloads 169
7366 Spatio-Temporal Pest Risk Analysis with ‘BioClass’

Authors: Vladimir A. Todiras

Abstract:

Spatio-temporal models provide new possibilities for real-time action in pest risk analysis. It should be noted that estimating the possibility and probability of introduction of a pest, and its economic consequences, involves many uncertainties. We present a new mapping technique that assesses pest invasion risk using the online BioClass software. BioClass is a GIS tool designed to solve multiple-criteria classification and optimization problems based on fuzzy logic and level set methods. This research describes a method for predicting the potential establishment and spread of a plant pest into new areas using case studies: corn rootworm (Diabrotica spp.), tomato leaf miner (Tuta absoluta) and plum fruit moth (Grapholita funebrana). Our study demonstrated that in BioClass we can combine fuzzy logic and geographic information systems with knowledge of pest biology and environmental data to derive new information for decision making. Pests are sensitive to a warming climate, as temperature greatly affects their survival and their reproductive rate and capacity. Changes have been observed in the distribution, frequency and severity of outbreaks of Helicoverpa armigera on tomato. BioClass has proven to be a powerful tool for applying dynamic models and mapping the potential future distribution of a species, enabling resource managers to make decisions about the management and control of dangerous and invasive species.

Keywords: classification, model, pest, risk

Procedia PDF Downloads 278
7365 Formal Models of Sanitary Inspections Teams Activities

Authors: Tadeusz Nowicki, Radosław Pytlak, Robert Waszkowski, Jerzy Bertrandt, Anna Kłos

Abstract:

This paper presents methods for the formal modeling of the activities of sanitary inspection teams during outbreaks of food-borne diseases. The models make it possible to measure the characteristics of sanitary inspection activities and, as a result, to improve the performance of sanitary services and thus food security.

Keywords: food-borne disease, epidemic, sanitary inspection, mathematical models

Procedia PDF Downloads 298
7364 Contingency Screening Using Risk Factor Considering Transmission Line Outage

Authors: M. Marsadek, A. Mohamed

Abstract:

Power system security analysis is a highly time-demanding process due to the large number of possible contingencies that need to be analyzed. In a power system, any contingency resulting in a security violation, such as line overload or low voltage, may occur for a number of reasons at any time. To rank a contingency efficiently, both its probability and the extent of the security violation must be considered, so as not to underestimate the risk associated with the contingency. This paper proposes a contingency ranking method that takes into account the probabilistic nature of the power system and the severity of the contingency, using a newly developed method based on a risk factor. The proposed technique is implemented on the IEEE 24-bus system.
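A minimal sketch of the risk-factor idea, ranking contingencies by the product of outage probability and a severity index; the line outages, probabilities and severity values are hypothetical, not results for the IEEE 24-bus system.

```python
# Risk-based contingency ranking: risk = outage probability x severity of the
# resulting security violation (both terms would come from power-flow analysis).

contingencies = [
    # (line outage, outage probability, severity of overload/low-voltage violation)
    ("line 1-2",   0.010, 3.2),
    ("line 3-9",   0.004, 8.5),
    ("line 7-8",   0.020, 0.9),
    ("line 15-21", 0.002, 12.0),
]

ranked = sorted(contingencies, key=lambda c: c[1] * c[2], reverse=True)
for name, prob, severity in ranked:
    print(f"{name:10s} probability={prob:.3f} severity={severity:5.1f} "
          f"risk={prob * severity:.4f}")
```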

Keywords: line overload, low voltage, probability, risk factor, severity

Procedia PDF Downloads 541
7363 Evaluation of Parameters of Subject Models and Their Mutual Effects

Authors: A. G. Kovalenko, Y. N. Amirgaliyev, A. U. Kalizhanova, L. S. Balgabayeva, A. H. Kozbakova, Z. S. Aitkulov

Abstract:

It is known that statistical information on the operation of compound multisite systems is often far from describing the actual state of the system and does not allow drawing any conclusions about the correctness of its operation. For example, from worldwide practice in the operation of water supply and water disposal systems, it is known that total measurements at consumers and at suppliers differ by 40-60%. This is connected with measurement inaccuracy as well as with ineffective operation of the corresponding systems. The analysis is more difficult for widely distributed systems in which subjects that are independent in their decision-making interact economically in production, purchase and sale, resale and consumption. This work analyzed mathematical models of sellers, consumers and arbitragers, and the models of their interaction, in a dispersed single-product market of perfect competition. On the basis of these models, methods are given that allow estimating the operating options of every subject and of the system as a whole.

Keywords: dispersed systems, models, hydraulic network, algorithms

Procedia PDF Downloads 279
7362 Poverty Dynamics in Thailand: Evidence from Household Panel Data

Authors: Nattabhorn Leamcharaskul

Abstract:

This study aims to examine the determining factors of the dynamics of poverty in Thailand by using panel data on 3,567 households in 2007-2017. Four estimation techniques are employed to analyze the situation of poverty across households and time periods: the multinomial logit model, the sequential logit model, the quantile regression model, and the difference-in-difference model. Households are categorized based on their experiences into 5 groups, namely chronically poor, falling into poverty, re-entering into poverty, exiting from poverty, and never poor households. Estimation results emphasize the effects of demographic and socioeconomic factors as well as unexpected events on the economic status of a household. It is found that remittances have a positive impact on a household's economic status in that they are likely to lower the probability of falling into poverty or remaining trapped in poverty, while they tend to increase the probability of exiting from poverty. In addition, receiving a secondary source of household income not only raises the probability of being a never poor household, but also significantly increases the household income per capita of the chronically poor and falling-into-poverty households. Public work programs are recommended as an important tool to relieve household financial burden and uncertainty and thus increase the chance for households to escape from poverty.
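A minimal sketch of one of the four techniques, a multinomial logit model fitted with statsmodels on synthetic stand-in data; the categories, covariates and coefficients are hypothetical, not the Thai household panel.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic stand-in for the household panel: predict a poverty-dynamics category
# (0 = never poor, 1 = exiting poverty, 2 = falling into poverty, 3 = chronically poor)
# from two hypothetical covariates.
n = 2000
remittance = rng.integers(0, 2, n)               # household receives remittances (0/1)
second_income = rng.integers(0, 2, n)            # secondary income source (0/1)
score = 1.0 - 0.8 * remittance - 0.9 * second_income + rng.normal(0, 1, n)
category = np.digitize(score, [-0.5, 0.5, 1.5])  # four ordered categories, 0..3

X = sm.add_constant(np.column_stack([remittance, second_income]))
model = sm.MNLogit(category, X).fit(disp=False)
print(model.summary())
# Each block of coefficients gives the log-odds of that category relative to the
# base category (0 = never poor); exponentiating yields relative risk ratios.
```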

Keywords: difference in difference, dynamic, multinomial logit model, panel data, poverty, quantile regression, remittance, sequential logit model, Thailand, transfer

Procedia PDF Downloads 106
7361 Identification of Classes of Bilinear Time Series Models

Authors: Anthony Usoro

Abstract:

In this paper, two classes of bilinear time series models are obtained under certain conditions from the general bilinear autoregressive moving average model: the Bilinear Autoregressive (BAR) and the Bilinear Moving Average (BMA) models. From the general bilinear model, the BAR and BMA models are proved to exist for q = Q = 0 (which implies j = 0) and for p = P = 0 (which implies i = 0), respectively. These models are found useful in modelling many economic and financial data series.
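A minimal sketch simulating a low-order bilinear process and its linear special case, to illustrate the kind of dynamics these models capture; the coefficients are hypothetical, and the exact model orders in the paper are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_bilinear(n, a=0.4, b=0.3, sigma=1.0, burn=200):
    """Simulate a simple bilinear process of the form
       X_t = a*X_{t-1} + b*X_{t-1}*e_{t-1} + e_t,
    a low-order special case of the general bilinear ARMA family."""
    e = rng.normal(0.0, sigma, n + burn)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = a * x[t - 1] + b * x[t - 1] * e[t - 1] + e[t]
    return x[burn:]

x_bilinear = simulate_bilinear(5000)
x_linear = simulate_bilinear(5000, b=0.0)   # b = 0 reduces to an ordinary AR(1)

# The bilinear term inflates the variance and produces occasional bursts,
# a feature often seen in economic and financial series.
print("variance, linear AR(1):  ", np.var(x_linear).round(3))
print("variance, bilinear model:", np.var(x_bilinear).round(3))
```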

Keywords: autoregressive model, bilinear autoregressive model, bilinear moving average model, moving average model

Procedia PDF Downloads 398
7360 Channels Splitting Strategy for Optical Local Area Networks of Passive Star Topology

Authors: Peristera Baziana

Abstract:

In this paper, we present a network configuration for WDM LANs of passive star topology in which the set of data WDM channels is split into two separate sets of channels with different access rights over them. In particular, a synchronous-transmission WDMA access algorithm is adopted in order to increase the probability of successful transmission over the data channels and, consequently, to reduce the probability of data packet transmission cancellations, which are applied to avoid data channel collisions. To this end, a pre-transmission access scheme is followed over a separate control channel. An analytical Markovian model is studied and the average throughput is mathematically derived. The performance is studied for several numbers of data channels and various values of the control phase duration.
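A minimal sketch contrasting per-slot throughput with and without a collision-avoiding pre-transmission coordination step over several data channels; the station count, channel count and transmission probability are hypothetical, and the sketch does not reproduce the paper's Markovian analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def throughput(n_stations=30, n_channels=8, p_tx=0.3, slots=20_000, coordinated=True):
    """Monte-Carlo estimate of the mean number of successful packets per slot.
    coordinated=True mimics a control-channel pre-transmission scheme in which,
    whenever several stations pick the same data channel, all but one withdraw,
    so data-channel collisions are avoided; coordinated=False lets them collide."""
    ok = 0
    for _ in range(slots):
        active = rng.random(n_stations) < p_tx
        channels = rng.integers(0, n_channels, n_stations)[active]
        counts = np.bincount(channels, minlength=n_channels)
        if coordinated:
            ok += np.count_nonzero(counts)          # one survivor per busy channel
        else:
            ok += np.count_nonzero(counts == 1)     # only uncontended packets succeed
    return ok / slots

print("uncoordinated WDMA throughput:", round(throughput(coordinated=False), 2))
print("coordinated WDMA throughput:  ", round(throughput(coordinated=True), 2))
```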

Keywords: access algorithm, channels division, collisions avoidance, wavelength division multiplexing

Procedia PDF Downloads 289
7359 Development of a Human Vibration Model Considering Muscles and Stiffness of Intervertebral Discs

Authors: Young Nam Jo, Moon Jeong Kang, Hong Hee Yoo

Abstract:

Most human vibration models have been modeled as multibody systems consisting of rigid bodies and spring-dampers. These models are developed for specific postures and conditions, so they cannot be used for vibration analysis across various postures and conditions. The purpose of this study is to develop a human vibration model that represents human vibration characteristics under various conditions by employing a musculoskeletal model. To do this, the human vibration model is developed based on biomechanical models, and muscle models are employed instead of spring-dampers. Muscle activations are controlled by a PD controller to maintain body posture while vertical vibration is applied. Each gain value of the controller is obtained by minimizing the difference in apparent mass and acceleration transmissibility between experiment and analysis using an optimization method.

Keywords: human vibration analysis, hill type muscle model, PD control, whole-body vibration

Procedia PDF Downloads 444
7358 Circuit Models for Conducted Susceptibility Analyses of Multiconductor Shielded Cables

Authors: Saih Mohamed, Rouijaa Hicham, Ghammaz Abdelilah

Abstract:

This paper presents circuit models to analyze the conducted susceptibility of multiconductor shielded cables in the frequency domain using Branin’s method, which is referred to as the method of characteristics. These models, which can be used directly in the time and frequency domains, take into account the presence of both the transfer impedance and the transfer admittance. The conducted susceptibility is studied by using an injection current on the cable shield as the source. Two examples are studied: a coaxial shielded cable and shielded cables with two parallel wires (i.e., twinax cables), whose shield has an asymmetry (one slot on the side). Results obtained by these models are in good agreement with those obtained by other methods.

Keywords: circuit models, multiconductor shielded cables, Branin’s method, coaxial shielded cable, twinax cables

Procedia PDF Downloads 507
7357 A Probability Analysis of Construction Project Schedule Using Risk Management Tool

Authors: A. L. Agarwal, D. A. Mahajan

Abstract:

The construction industry tumbled along with other sectors during the recent economic crash. The construction business could not recover thereafter and is still passing through a slowdown phase, with the result that many real estate as well as infrastructure projects were not completed on schedule and within budget. There are many theories, tools and techniques, with software packages available in the market, to analyze construction schedules. This study focuses on the construction project schedule and the uncertainties associated with construction activities. An infrastructure construction project is considered for the analysis of the uncertainty in project activities affecting project duration, and the analysis is done using @RISK software. Simulation results arising from three probability distribution functions are compiled to help construction project managers plan more realistic schedules for the various construction activities as well as for project completion, document them in the contract, and avoid compensation or claims arising from missing the planned schedule.
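A minimal sketch of the Monte-Carlo schedule analysis idea (in the spirit of, but independent of, @RISK), sampling activity durations from triangular distributions; the activities and durations are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each critical-path activity duration is sampled from a triangular distribution
# defined by (optimistic, most likely, pessimistic) estimates in days.
activities = [
    ("excavation",      20, 25, 40),
    ("foundation",      30, 35, 55),
    ("superstructure",  60, 75, 110),
    ("finishing",       40, 50, 80),
]

N = 50_000
total = np.zeros(N)
for _, lo, mode, hi in activities:
    total += rng.triangular(lo, mode, hi, N)

for q in (0.50, 0.80, 0.95):
    print(f"P{int(q * 100)} project duration: {np.quantile(total, q):.0f} days")
print("probability of finishing within 200 days:", np.mean(total <= 200).round(3))
```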

Keywords: construction project, distributions, project schedule, uncertainty

Procedia PDF Downloads 342
7356 On the Evaluation of Different Turbulence Models through the Displacement of Oil-Water Flow in Porous Media

Authors: Sidique Gawusu, Xiaobing Zhang

Abstract:

Turbulence models play a significant role in all computational fluid dynamics based modelling approaches. There is, however, no general turbulence model suitable for all flow scenarios. Therefore, a successful numerical modelling approach is only achievable if an appropriate closure model is used. This paper evaluates different turbulence models in the numerical modelling of oil-water flow within the Eulerian-Eulerian approach. A comparison between the obtained numerical results and published benchmark data showed reasonable agreement. The domain was meshed using a structured mesh, and a grid test was performed to ascertain grid independence. The evaluation of the models was made through analysis of velocity and pressure profiles across the domain. The models were tested for their suitability to obtain accurate, scalable and precise numerical results. As a result, it is found that all the models except Standard-ω provide comparable results. The study also revealed new insights into flow in porous media, specifically in oil reservoirs.

Keywords: turbulence modelling, simulation, multi-phase flows, water-flooding, heavy oil

Procedia PDF Downloads 268
7355 Modelling High-Frequency Crude Oil Dynamics Using Affine and Non-Affine Jump-Diffusion Models

Authors: Katja Ignatieva, Patrick Wong

Abstract:

We investigate the dynamics of high-frequency energy prices, including crude oil and electricity prices. The returns of the underlying quantities are modelled using various parametric models, such as a stochastic framework with jumps and stochastic volatility (SVCJ), as well as non-parametric alternatives, which are purely data driven and do not require specification of the drift or the diffusion coefficient function. Using different statistical criteria, we investigate the performance of the considered parametric and non-parametric models in their ability to forecast price series and volatilities. Our models incorporate possible seasonalities in the underlying dynamics and utilise advanced estimation techniques for the dynamics of energy prices.

Keywords: stochastic volatility, affine jump-diffusion models, high frequency data, model specification, Markov chain Monte Carlo

Procedia PDF Downloads 93
7354 The Reliability Analysis of Concrete Chimneys Due to Random Vortex Shedding

Authors: Saba Rahman, Arvind K. Jain, S. D. Bharti, T. K. Datta

Abstract:

Chimneys are generally tall and slender structures with circular cross-sections, due to which they are highly prone to wind forces. Wind exerts pressure on the wall of the chimneys, which produces unwanted forces. Vortex-induced oscillation is one such excitation which can lead to the failure of the chimneys. Therefore, vortex-induced oscillation of chimneys is of great concern to researchers and practitioners, since many failures of chimneys due to vortex shedding have occurred in the past. As a consequence, extensive research has taken place on the subject over decades. Many laboratory experiments have been performed to verify the theoretical models proposed to predict vortex-induced forces, including aero-elastic effects. Comparatively, very few prototype measurement data have been recorded to verify the proposed theoretical models. For this reason, the theoretical models developed with the help of experimental laboratory data are utilized for analyzing the chimneys for vortex-induced forces. This calls for reliability analysis of the predictions of the responses of the chimneys produced due to vortex shedding phenomena. Although several works of literature exist on the vortex-induced oscillation of chimneys, including code provisions, the reliability analysis of chimneys against failure caused by vortex shedding is scanty. In the present study, the reliability analysis of chimneys against vortex shedding failure is presented, assuming the uncertainty in the vortex shedding phenomenon to be significantly greater than the other uncertainties, which are hence ignored. The vortex shedding is modeled as a stationary random process and is represented by a power spectral density function (PSDF). It is assumed that the vortex shedding forces are perfectly correlated and act over the top one-third height of the chimney. The PSDF of the tip displacement of the chimney is obtained by performing a frequency domain spectral analysis using a matrix approach. For this purpose, both the chimney and the random wind forces are discretized over a number of points along the height of the chimney. The method of analysis duly accounts for the aero-elastic effects. The double barrier threshold crossing level, as proposed by Vanmarcke, is used for determining the probability of crossing different threshold levels of the tip displacement of the chimney. Assuming the annual distribution of the mean wind velocity to be a Gumbel type-I distribution, the fragility curve denoting the variation of the annual probability of threshold crossing against different threshold levels of the tip displacement of the chimney is determined. The reliability estimate is derived from the fragility curve. A 210 m tall concrete chimney with a base diameter of 35 m, a top diameter of 21 m, and a thickness of 0.3 m has been taken as an illustrative example. The terrain condition is assumed to be that corresponding to the city center. The expression for the PSDF of the vortex shedding force is taken from Vickery and Basu. The results of the study show that the threshold crossing reliability of the tip displacement of the chimney is significantly influenced by the assumed structural damping and the Gumbel distribution parameters. Further, the aero-elastic effect influences the reliability estimate to a great extent for small structural damping.
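A minimal sketch of the fragility-curve construction described above: the annual mean wind speed follows a Gumbel type-I distribution, and the conditional probability of the tip displacement crossing a threshold is integrated over it. The Gumbel parameters and the conditional crossing model below are hypothetical stand-ins for the paper's spectral/threshold-crossing analysis.

```python
import numpy as np
from scipy.stats import gumbel_r, norm

# Annual mean wind speed: Gumbel (type-I) distribution (parameters hypothetical)
wind_loc, wind_scale = 28.0, 4.0                  # m/s
wind = np.linspace(5.0, 70.0, 400)
f_wind = gumbel_r.pdf(wind, loc=wind_loc, scale=wind_scale)
dv = wind[1] - wind[0]

def p_cross_given_wind(threshold_m, v):
    """Hypothetical conditional probability that the tip displacement exceeds the
    threshold in a year with mean wind speed v (lognormal stand-in for the
    spectral/threshold-crossing analysis)."""
    median_disp = 0.0004 * v**2                   # displacement grows roughly with v^2
    return norm.sf(np.log(threshold_m), loc=np.log(median_disp), scale=0.4)

for threshold in (0.3, 0.5, 0.8):                 # tip-displacement thresholds in metres
    annual_p = np.sum(p_cross_given_wind(threshold, wind) * f_wind) * dv
    print(f"threshold {threshold:.1f} m -> annual crossing probability {annual_p:.3e}")
```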

Keywords: chimney, fragility curve, reliability analysis, vortex-induced vibration

Procedia PDF Downloads 154
7353 Investigating Jacket-Type Offshore Structures Failure Probability by Applying the Reliability Analyses Methods

Authors: Majid Samiee Zonoozian

Abstract:

For constructions as important as jacket-type platforms, scrupulous attention in the analysis, design and calculation processes is needed. Reliability assessment has become an extensively used method for the safety calculation of jacket platforms. In the present study, a methodology for the reliability calculation of an offshore jacket platform against the extreme wave loading state is presented. Sensitivity analyses are applied to obtain the nonlinear response of jacket-type platforms against extreme waves. The jacket structure is modeled by applying a nonlinear finite-element model that accounts for the behavior of the tubular members. The probability of a member's failure under extreme wave loading is computed by a finite-element reliability code. The FORM and SORM approaches are applied, and the reliability indexes have been determined. A case study of a fixed jacket-type structure positioned in the Persian Gulf is analyzed by means of the proposed method. Furthermore, the failure criteria are defined using the equations suggested by the 21st edition of API RP 2A-WSD for the design of the tubular members of jacket-type structures under combined axial load and bending. Consequently, the effect of wave loads on the reliability index was considered.

Keywords: Jacket-Type structure, reliability, failure probability, tubular members

Procedia PDF Downloads 166
7352 Educational Leadership and Artificial Intelligence

Authors: Sultan Ghaleb Aldaihani

Abstract:

The environment in which educational leadership takes place is becoming increasingly complex due to factors like globalization and rapid technological change. This is creating a "leadership gap" in which the complexity of the environment outpaces the ability of leaders to respond effectively. Educational leadership involves guiding teachers and the broader school system towards improved student learning and achievement. Implications of Artificial Intelligence (AI) in Educational Leadership: AI has great potential to enhance education, for example through intelligent tutoring systems and by automating routine tasks to free up teachers. AI can also have significant implications for educational leadership by providing better information and data-driven decision-making capabilities; computer-adaptive testing, for instance, can provide detailed, individualized data on student learning that leaders can use for instructional decisions and accountability. Enhancing Decision-Making Processes: Statistical models and data mining techniques can help identify at-risk students earlier, allowing for targeted interventions, and probability-based models can diagnose students likely to drop out, enabling proactive support. These data-driven approaches can make resource allocation and decision-making more effective. Improving Efficiency and Productivity: AI systems can automate tasks and change processes to improve the efficiency of educational leadership and administration, and integrating AI can free leaders to focus more on the human, interactive elements of their role.

Keywords: education, leadership, technology, artificial intelligence

Procedia PDF Downloads 26
7351 Role of ASHA in Utilizing Maternal Health Care Services India, Evidences from National Rural Health Mission (NRHM)

Authors: Dolly Kumari, H. Lhungdim

Abstract:

Maternal health is one of the crucial health indicators for any country, and the 5th Millennium Development Goal also emphasizes the improvement of maternal health. Soon after Independence, the government of India realized the importance of maternal and child health care services and took steps to strengthen them in the 1st and 2nd five-year plans. In the past decade, another health indicator, life expectancy at birth, has shown remarkable improvement. However, maternal mortality is still high in India, and in some states it is observed to be much higher than the national average. The government of India invested substantial funds and initiated the National Rural Health Mission (NRHM) in 2005 to improve maternal health in the country by providing affordable and accessible health care services. The Accredited Social Health Activist (ASHA) is one of the key components of the NRHM. ASHAs are mainly women aged 25-45 years, selected from the village itself and accountable for the monitoring of maternal health care in the same village. ASHAs are trained to work as an interface between the community and the public health system. This study tries to assess the role of ASHAs in the utilization of maternal health care services and to examine the level of awareness about the benefits given under the JSY scheme and the utilization of those benefits by eligible women. The study uses concurrent evaluation data from the NRHM, initiated by the government of India in 2005, and is based on 78,205 currently married women from 70 different districts of India. Descriptive statistics, the chi-square test and binary logistic regression have been used for the analysis. The probability of institutional delivery increases 2.03 times (p<0.001), and if the ASHA arranged or helped in arranging a transport facility, the probability of institutional delivery increases 1.67 times (p<0.01) compared with when she did not arrange transport. Further, if the ASHA facilitated getting a JSY card for the pregnant woman, the probability of going for full ANC increases 1.36 times (p<0.05) relative to the reference. If the ASHA discussed institutional delivery and approached the woman to get registered, the probability of getting a TT injection is 1.88 and 1.64 times (p<0.01) higher, respectively, than if she did not. The probability of benefiting from JSY schemes is 1.25 times (p<0.001) higher among women who married after 18 years of age than among those married before 18, and it is 1.28 times (p<0.001) and 1.32 times (p<0.001) higher among women with 1 to 8 years of schooling and with 9 or more years of schooling, respectively, than among women who never attended school. Working women have a 1.13 times (p<0.001) higher probability of benefiting from the JSY scheme than non-working women. Surprisingly, women belonging to the wealthiest quintile are 0.53 times (p<0.001) as likely to be aware of the JSY scheme. The results show that the work done by ASHAs has a great influence on maternal health care utilization in India, but they also show that a substantial part of the target population is still far from utilizing these services. The place of delivery is significantly influenced by the referral and transport facilities arranged by the ASHA.

Keywords: institutional delivery, JSY beneficiaries, referral faculty, public health

Procedia PDF Downloads 318
7350 Advanced Combinatorial Method for Solving Complex Fault Trees

Authors: José de Jesús Rivero Oliva, Jesús Salomón Llanes, Manuel Perdomo Ojeda, Antonio Torres Valle

Abstract:

Combinatorial explosion is a problem common to both predominant methods for solving fault trees: the Minimal Cut Set (MCS) approach and the Binary Decision Diagram (BDD). High memory consumption impedes the complete solution of very complex fault trees. Only approximate, non-conservative solutions are possible in these cases using truncation or other simplification techniques. The paper proposes a method (CSolv+) for solving complex fault trees without any possibility of combinatorial explosion. Each individual MCS is immediately discarded after its contribution to the basic events' importance measures and to the Top gate Upper Bound Probability (TUBP) has been accounted for. An estimation of the Top gate Exact Probability (TEP) is also provided. Therefore, running in a computer cluster, CSolv+ will guarantee the complete solution of complex fault trees. It was successfully applied to 40 fault trees from the Aralia fault tree database, performing the evaluation of the top gate probability, the 1000 Significant MCSs (SMCS), and the Fussell-Vesely, RRW and RAW importance measures for all basic events. The highly complex fault tree nus9601 was solved with truncation probabilities from 10⁻²¹ to 10⁻²⁷ just to limit the execution time. The solution corresponding to 10⁻²⁷ evaluated 3,530,592,796 MCSs in 3 hours and 15 minutes.
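A minimal sketch of the Top gate Upper Bound Probability computed from minimal cut sets, the quantity each MCS contributes to before being discarded; the basic-event probabilities and cut sets are hypothetical.

```python
# TUBP from minimal cut sets: each MCS probability is the product of its
# basic-event probabilities, and the min-cut upper bound combines them as
# 1 - prod(1 - P(MCS_i)).

basic_events = {"BE1": 1e-3, "BE2": 5e-4, "BE3": 2e-3, "BE4": 1e-4}
minimal_cut_sets = [("BE1", "BE2"), ("BE3",), ("BE2", "BE4")]

def mcs_probability(cut_set):
    p = 1.0
    for event in cut_set:
        p *= basic_events[event]
    return p

tubp = 1.0
for cs in minimal_cut_sets:
    tubp *= (1.0 - mcs_probability(cs))   # each MCS contributes once, then can be discarded
tubp = 1.0 - tubp

rare_event_sum = sum(mcs_probability(cs) for cs in minimal_cut_sets)
print(f"TUBP = {tubp:.6e}, rare-event approximation = {rare_event_sum:.6e}")
```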

Keywords: system reliability analysis, probabilistic risk assessment, fault tree analysis, basic events importance measures

Procedia PDF Downloads 33
7349 Study of Adsorption Isotherm Models on Rare Earth Elements Biosorption for Separation Purposes

Authors: Nice Vasconcelos Coimbra, Fábio dos Santos Gonçalves, Marisa Nascimento, Ellen Cristine Giese

Abstract:

The development of chemical routes for the recovery and separation of rare earth elements (REE) is seen as a priority and strategic action by several countries demanding these elements. Among the possible alternative routes, the biosorption process has been evaluated in our laboratory. In this context, the present work attempts to assess and fit the solution equilibrium data to the Langmuir, Freundlich and DKR isotherm models, based on the biosorption results for the lanthanum and samarium elements by Bacillus subtilis immobilized on calcium alginate gel. It was observed that the preference of adsorption of REE by the immobilized biomass followed the order Sm(III) > La(III). It can be concluded that, among the studied isotherm models, the Langmuir model presented better mathematical results than the Freundlich and DKR models.
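A minimal sketch of fitting equilibrium data to the Langmuir isotherm by non-linear least squares; the (Ce, q) values are hypothetical, not the La(III)/Sm(III) data of the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical biosorption equilibrium data
Ce = np.array([5., 10., 20., 40., 80., 120.])        # equilibrium concentration, mg/L
q  = np.array([8.1, 13.9, 21.3, 28.6, 33.9, 35.8])   # uptake, mg/g

def langmuir(ce, q_max, K):
    """Langmuir isotherm: q = q_max * K * Ce / (1 + K * Ce)."""
    return q_max * K * ce / (1.0 + K * ce)

(q_max, K), _ = curve_fit(langmuir, Ce, q, p0=[40.0, 0.05])
residuals = q - langmuir(Ce, q_max, K)
r2 = 1.0 - np.sum(residuals**2) / np.sum((q - q.mean())**2)
print(f"q_max = {q_max:.1f} mg/g, K = {K:.3f} L/mg, R^2 = {r2:.3f}")
```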

Keywords: rare earth elements, biosorption, Bacillus subtilis, adsorption isotherm models

Procedia PDF Downloads 152
7348 High Motivational Salient Face Distractors Slowed Target Detection: Evidence from Behavioral Studies

Authors: Rashmi Gupta

Abstract:

Rewarding stimuli capture attention involuntarily as a result of an association process that develops quickly during value learning, referred to as reward- or value-driven attentional capture. It is essential to compare reward processing with punishment processing to get a full picture of value-based modulation in visual attention processing. Hence, the present study manipulated both valence/value (reward as well as punishment) and motivational salience (probability of an outcome: high vs. low) together. A series of experiments was conducted, each with two phases. In phase 1, participants were required to learn to associate specific face stimuli with a high or low probability of winning or losing points. In the second phase, these conditioned stimuli then served as a distractor or prime in a speeded letter search task. Faces with high versus low outcome probability, regardless of valence, slowed the search for targets (specifically, targets in the left visual field), suggesting that the costs to performance on non-emotional cognitive tasks were driven only by the motivational salience (high vs. low) associated with the stimuli rather than by their valence (gain vs. loss). It also suggests that the processing of motivationally salient stimuli is right-hemisphere biased. Together, the results of these studies strengthen the notion that our visual attention system is more sensitive to motivational salience than to valence, which is termed here motivation-driven attentional capture.

Keywords: attention, distractors, motivational salience, valence

Procedia PDF Downloads 216
7347 A Parametric Study on Effects of Internal Factors on Carbonation of Reinforced Concrete

Authors: Kunal Tongaria, Abhishek Mangal, S. Mandal, Devendra Mohan

Abstract:

The carbonation of concrete is a phenomenon which is a function of various interdependent parameters. Therefore, in spite of the extensive literature and databases, useful generalization is not an easy task. These interdependent parameters can be grouped into internal and external factors. This paper focuses on the internal parameters which govern and increase the probability of the ingress of deleterious substances into concrete. The mechanisms by which internal parameters such as microstructure (with and without supplementary cementing materials, SCM), water/binder ratio, and age of concrete affect carbonation are discussed. This is followed by a comparison of various proposed mathematical models for the deterioration of concrete. Based on existing laboratory experiments as well as field results, this paper summarizes the present understanding of the mechanisms, the modeling, and the future research needs in this field.

Keywords: carbonation, diffusion coefficient, microstructure of concrete, reinforced concrete

Procedia PDF Downloads 403
7346 Reconstructability Analysis for Landslide Prediction

Authors: David Percy

Abstract:

Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data that works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, working with continuous data, such as porosity, requires that these data be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, the independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as the primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot encoding schemes, RA works directly with the data as it is encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields accuracy that is similar but with the advantage of a completely transparent model. The results of an RA session with a data set are a report on every combination of variables and the associated probability of landslide occurrence. In this way, every informative combination of variable states can be examined.
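A minimal sketch of the kind of output described above: the observed probability of landslide occurrence for every combination of discretized independent-variable states. The layers and values are hypothetical, and this is a contingency-table summary rather than a full RA implementation.

```python
import pandas as pd

# Hypothetical discretized layers (IVs) and landslide occurrence (DV)
data = pd.DataFrame({
    "bedrock":   ["basalt", "basalt", "shale", "shale", "shale", "basalt", "shale", "basalt"],
    "soil":      ["clay", "sand", "clay", "clay", "sand", "clay", "sand", "sand"],
    "slope_bin": ["steep", "gentle", "steep", "steep", "gentle", "gentle", "steep", "steep"],
    "landslide": [1, 0, 1, 1, 0, 0, 1, 0],
})

# Probability of landslide for every observed combination of IV states
table = (data.groupby(["bedrock", "soil", "slope_bin"])["landslide"]
             .agg(events="sum", cells="count"))
table["p_landslide"] = table["events"] / table["cells"]
print(table.sort_values("p_landslide", ascending=False))
```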

Keywords: reconstructability analysis, machine learning, landslides, raster analysis

Procedia PDF Downloads 54
7345 Archaeology Study of Soul Houses in Ancient Egypt on Five Models in the Grand Egyptian Museum

Authors: Mahmoud Aly, Mohamed Ismail, Mohamed Badereldin, Amro Mostafa

Abstract:

Introduction: Models of soul houses appeared in the prehistoric period and in the Old and Middle Kingdom periods. They represented the deceased's imagination of his house in the afterlife, and some of these soul houses had two floors. The study examines five models of soul houses which were discovered near the Saqqara site by an Egyptian mission. These models have been transferred to The Grand Egyptian Museum (GEM) to be ready for display at the new museum. We focus on five models of soul houses (GEM numbers 1276, 1280, 1281, 1282, 8711) dating to the Old Kingdom period. These models are all made of pottery, have an oval shape, and are decorated with relief. Methodology: The study focuses on the development of soul houses during the different periods in ancient Egypt and on the kinds of offerings, which reflect the economic situation of Egyptian society, as well as on the kinds of oils that were well known in ancient Egypt. Conclusion: This research focuses on the function of the soul house and the kinds of offerings placed in it. This study will be useful for heritage and ancient civilization studies, especially in the context of opening new museums like The Grand Egyptian Museum, which will display a new collection of soul houses.

Keywords: archaeology study, Grand Egyptian Museum, relief, soul houses

Procedia PDF Downloads 77
7344 Shock Compressibility of Iron Alloys Calculated in the Framework of Quantum-Statistical Models

Authors: Maxim A. Kadatskiy, Konstantin V. Khishchenko

Abstract:

Iron alloys are widespread components of various types of structural materials which are exposed to intensive thermal and mechanical loads. Various quantum-statistical cell models with the self-consistent field approximation can be used for predicting the behavior of these materials under extreme conditions. The application of these models becomes more valid the higher the temperature and the density of the matter. Results of Hugoniot calculations for iron alloys in the framework of three quantum-statistical models (the Thomas–Fermi model, the Thomas–Fermi model with quantum and exchange corrections, and the Hartree–Fock–Slater model) are presented. The results of the quantum-statistical calculations are compared with results from other reliable models and with available experimental data. Good agreement between the calculation results and experimental data is revealed at terapascal pressures. Advantages and disadvantages of this approach are shown.

Keywords: alloy, Hugoniot, iron, terapascal pressure

Procedia PDF Downloads 335