Search results for: technological pedagogical model

10855 Refinement of an Existing Benzothiazole Lead Targeting Lysine Aminotransferase in the Dormant Stage of Mycobacterium tuberculosis

Authors: R. Reshma Srilakshmi, S. Shalini, P. Yogeeswari, D. Sriram

Abstract:

Lysine aminotransferase (LAT) is a crucial enzyme for dormancy in M. tuberculosis and is involved in persistence and antibiotic resistance. In the present work, we attempted to develop benzothiazole derivatives as lysine aminotransferase inhibitors. In doing so, we unexpectedly arrived at an interesting compound, 21, (E)-4-(5-(2-(benzo[d]thiazol-2-yl)-2-cyanovinyl)thiophen-2-yl)benzoic acid, which, even though it has only moderate activity against the persistent phase of mycobacteria, shows significant potency against the active phase. In the entire series, compound 22, (E)-4-(5-(2-(benzo[d]thiazol-2-yl)-2-cyanovinyl)thiophen-2-yl)isophthalic acid, emerged as the most potent molecule, with a LAT IC50 of 2.62 µM. It produced significant log reductions of 2.9 and 2.3 against nutrient-starved and biofilm-forming mycobacteria, respectively. It was found to be inactive in the MABA assay and in the M. marinum-induced zebrafish model, and it is devoid of cytotoxicity. Compound 22 also possesses a bactericidal effect that is independent of concentration and time, and it was effective in combination with rifampicin in a 3D granuloma model. These results are encouraging, as the hit molecule shows activity against both the active and persistent forms of tuberculosis. The identified hit needs further pharmacokinetic and pharmacodynamic screening before development as a new drug candidate.

Keywords: benzothiazole, latent tuberculosis, LAT, nutrient starvation

Procedia PDF Downloads 317
10854 A Hybrid Algorithm for Collaborative Transportation Planning among Carriers

Authors: Elham Jelodari Mamaghani, Christian Prins, Haoxun Chen

Abstract:

This paper focuses on collaborative transportation planning (CTP) among multiple carriers with pickup and delivery requests and time windows. The problem is a vehicle routing problem with constraints drawn from standard vehicle routing problems and new constraints from a real-world application. Each carrier has a finite number of vehicles, and each request is a pickup and delivery request with a time window. Moreover, each carrier has reserved requests, which must be served by itself, whereas its exchangeable requests can be outsourced to and served by other carriers. This collaboration among carriers can help them reduce total transportation costs. A mixed integer programming model is proposed for the problem. To solve the model, a hybrid algorithm that combines a Genetic Algorithm and Simulated Annealing (GASA) is proposed, taking advantage of both metaheuristics at the same time. After tuning the parameters of the algorithm with the Taguchi method, experiments were conducted, and experimental results are provided for the hybrid algorithm. The results are compared with those obtained by a commercial solver; the comparison indicates that GASA significantly outperforms it. A sketch of the hybrid scheme is given below.
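
A minimal sketch of the GA/SA hybrid idea, assuming a permutation-encoded routing problem: a genetic algorithm supplies crossover-based search, while a simulated-annealing acceptance rule governs mutations. The encoding, cost function, and parameters are illustrative placeholders, not the authors' actual CTP model.

```python
# A minimal sketch of a GA/SA hybrid for a permutation-encoded routing
# problem. Encoding, cost function, and parameters are illustrative
# placeholders, not the authors' CTP model.
import math
import random

def cost(route, dist):
    """Total tour length for a permutation of request locations."""
    return sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))

def crossover(p1, p2):
    """Order-preserving crossover: keep a slice of p1, fill the rest from p2."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child]
    return [fill.pop(0) if g is None else g for g in child]

def gasa(dist, pop_size=30, generations=200, t0=100.0, alpha=0.95):
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    best = min(pop, key=lambda r: cost(r, dist))
    temp = t0
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            child = crossover(*random.sample(pop, 2))
            # SA-style mutation: accept a worse swap with Boltzmann probability
            i, j = random.sample(range(n), 2)
            cand = child[:]
            cand[i], cand[j] = cand[j], cand[i]
            delta = cost(cand, dist) - cost(child, dist)
            if delta < 0 or random.random() < math.exp(-delta / temp):
                child = cand
            nxt.append(child)
        pop = nxt
        temp *= alpha  # geometric cooling schedule
        best = min(pop + [best], key=lambda r: cost(r, dist))
    return best

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(12)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
print(gasa(dist))
```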

Keywords: centralized collaborative transportation, collaborative transportation with pickup and delivery, collaborative transportation with time windows, hybrid algorithm of GA and SA

Procedia PDF Downloads 376
10853 Detecting Port Maritime Communities in Spain with Complex Network Analysis

Authors: Nicanor Garcia Alvarez, Belarmino Adenso-Diaz, Laura Calzada Infante

Abstract:

In recent years, researchers have shown an interest in modelling maritime traffic as a complex network. In this paper, we propose a bipartite weighted network to model maritime traffic and detect port maritime communities. The bipartite weighted network considers two different types of nodes: the first represents Spanish ports, while the second represents the countries with which there is major import/export activity. The flow between both types of nodes is modeled by weighting the volume of product transported. To illustrate the model, the data is segmented by type of traffic. This allows fine-tuning and the creation of communities for each type of traffic, and therefore the identification of similar ports for a specific type of traffic, providing decision-makers with tools to search for alliances or identify their competitors. The traffic with the greatest impact on the Spanish gross domestic product is selected, and the evolution of the communities formed by the most important ports, and their differences between 2009 and 2019, is analyzed. Finally, the set of communities formed by the ports of the Spanish port system is inspected to determine global similarities between them, analyzing the sum of the memberships of the different ports in the communities formed for each type of traffic. A sketch of the network construction is given below.
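
A minimal sketch of the bipartite network construction, assuming made-up ports, countries, and tonnages. The paper applies Infomap (see the keywords); since a verified Infomap API is not assumed here, networkx's built-in Louvain method stands in as the weighted community detector.

```python
# A minimal sketch of the bipartite port-country network. Ports, countries,
# and tonnages are made-up placeholders; Louvain stands in for Infomap.
import networkx as nx

G = nx.Graph()
ports = ["Algeciras", "Valencia", "Barcelona", "Bilbao"]
countries = ["China", "USA", "Morocco", "Brazil"]
G.add_nodes_from(ports, bipartite=0)      # first node type: Spanish ports
G.add_nodes_from(countries, bipartite=1)  # second node type: partner countries

# Edge weights model the volume of product transported (illustrative tonnes)
flows = [
    ("Algeciras", "China", 120_000), ("Algeciras", "Morocco", 90_000),
    ("Valencia", "China", 150_000), ("Valencia", "USA", 60_000),
    ("Barcelona", "USA", 80_000), ("Barcelona", "Brazil", 20_000),
    ("Bilbao", "Brazil", 70_000), ("Bilbao", "Morocco", 10_000),
]
G.add_weighted_edges_from(flows)

# Detect weighted communities; ports grouped with the countries they trade
# with most heavily are candidate "port maritime communities".
communities = nx.community.louvain_communities(G, weight="weight", seed=42)
for i, c in enumerate(communities):
    print(f"Community {i}: {sorted(c)}")
```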

Keywords: bipartite networks, competition, infomap, maritime traffic, port communities

Procedia PDF Downloads 136
10852 Numinous Luminosity: A Mixed Methods Study of Mystical Light Experiences

Authors: J. R. Dinsmore, R. W. Hood

Abstract:

Experiences of a divine or mystical light are frequently reported in religious/spiritual experiences today, most notably in the context of mystical and near-death experiences. Light of a transcendental nature, and experiences of it, are also widely present and highly valued in many religious and mystical traditions. Despite the significance of this luminosity to the topic of religious experience, efforts to study the phenomenon empirically have been minimal and scattered. This mixed methods study developed and validated a questionnaire for the measurement of numinous luminosity experience and investigated the dimensions and effects of this novel construct using both quantitative and qualitative methodologies. A sequential explanatory design (participant selection model) was used, which involved a scale development phase, followed by a correlational study testing hypotheses derived from the literature about its effects on beliefs and well-being, and lastly, a phenomenological study of a sample selected from the correlational phase results. The outcomes of the study are a unified theoretical model of numinous luminosity experience across multiple experiential contexts, initial correlational findings regarding the possible mechanism of its reported positive transformational effects, and a valid and reliable instrument for its further empirical study.

Keywords: religious experience, mystical experience, near-death experience, scale development, questionnaire, divine light, mystical light, mystical luminosity

Procedia PDF Downloads 80
10851 Method for Auto-Calibrating Projector and Color-Depth Systems for Spatial Augmented Reality Applications

Authors: R. Estrada, A. Henriquez, R. Becerra, C. Laguna

Abstract:

Spatial Augmented Reality is a variation of Augmented Reality in which a Head-Mounted Display is not required. This variation is useful in cases where the need for a Head-Mounted Display is itself a limitation. To achieve this, Spatial Augmented Reality techniques substitute the technological elements of Augmented Reality; the virtual world is projected onto a physical surface. To create an interactive spatial augmented experience, the application must be aware of the spatial relations that exist between its core elements, here referred to as the projection system and the input system; the process of achieving this spatial awareness is called system calibration. The Spatial Augmented Reality system is considered calibrated if the projected virtual world scale matches the real-world scale, meaning that a virtual object maintains its perceived dimensions when projected into the real world. The input system is calibrated if the application knows the relative position of a point in the projection plane with respect to the RGB-depth sensor origin. Any kind of projection technology can be used (light-based projectors, close-range projectors, or screens) as long as it complies with the defined constraints; the method was tested on different configurations. The proposed procedure does not rely on a physical marker, minimizing human intervention in the process. The tests were made using a Kinect V2 as the input sensor and several projection devices. To test the method, the defined constraints were applied to a variety of physical configurations; once the method was executed, several variables were measured to assess its performance. It was demonstrated that the method can handle different arrangements, giving the user a wide range of setup possibilities. The calibration idea is sketched below.
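
A minimal sketch of the calibration idea using OpenCV: given point correspondences between the projector image plane and the RGB-depth sensor image, estimate the homography relating the two. The correspondences here are synthetic placeholders; the paper's marker-free procedure for collecting them is not reproduced.

```python
# A minimal sketch of projector/RGB-depth calibration: recover the mapping
# between projected points and points seen by the sensor. Correspondences
# are synthetic placeholders.
import cv2
import numpy as np

# Known pixel positions of calibration targets in the projector image plane
projector_pts = np.array(
    [[100, 100], [1820, 100], [1820, 980], [100, 980], [960, 540]],
    dtype=np.float32)

# Same targets as observed in the RGB-depth sensor image (illustrative)
sensor_pts = np.array(
    [[143, 122], [1501, 98], [1525, 867], [120, 901], [812, 497]],
    dtype=np.float32)

# Homography relating the sensor image plane to the projection plane;
# RANSAC rejects outlier correspondences.
H, mask = cv2.findHomography(sensor_pts, projector_pts, cv2.RANSAC, 3.0)

# Map a point detected by the sensor into projector coordinates, so the
# application can render feedback exactly where the user touched.
touch = np.array([[[600.0, 450.0]]], dtype=np.float32)
print(cv2.perspectiveTransform(touch, H))
```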

Keywords: color depth sensor, human computer interface, interactive surface, spatial augmented reality

Procedia PDF Downloads 112
10850 Numerical Predictions of Trajectory Stability of a High-Speed Water-Entry and Water-Exit Projectile

Authors: Lin Lu, Qiang Li, Tao Cai, Pengjun Zhang

Abstract:

In this study, a detailed numerical analysis of the trajectory stability and flow characteristics of a high-speed projectile during the water-entry and water-exit process has been carried out. The Zwart-Gerber-Belamri (Z-G-B) cavitation model and the SST k-ω turbulence model based on the Reynolds-Averaged Navier-Stokes (RANS) method are employed; the Z-G-B source terms are quoted below for reference. The numerical methodology is validated by comparing the experimental photographs of cavitation shape and the experimental underwater velocity with the numerical simulation results. Based on this methodology, the influences of the rotational speed and the water-entry and water-exit angles of the projectile on trajectory stability and flow characteristics have been examined in detail, along with the variation of the projectile trajectory and total resistance. In addition, the cavitation characteristics of water entry and water exit are presented and analyzed. Results show that rotation of the projectile may not be a practical means of achieving stability during water entry and water exit. Furthermore, there ought to be a critical water-entry angle for the water-entry stability of a practical projectile. The impact of the water-exit angle on trajectory stability and the cavity phenomenon is not as remarkable as that of the water-entry angle.
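
For reference, one common statement of the Z-G-B mass-transfer source terms, as implemented in standard CFD codes; the paper does not spell out its exact formulation, so this is quoted from the general literature. Evaporation occurs where the local pressure p falls below the vapor pressure p_v, and condensation where it exceeds it:

$$\dot m^{+}=F_{\mathrm{vap}}\,\frac{3\,\alpha_{\mathrm{nuc}}\,(1-\alpha_v)\,\rho_v}{R_B}\sqrt{\frac{2}{3}\,\frac{p_v-p}{\rho_l}}\quad (p<p_v),\qquad \dot m^{-}=F_{\mathrm{cond}}\,\frac{3\,\alpha_v\,\rho_v}{R_B}\sqrt{\frac{2}{3}\,\frac{p-p_v}{\rho_l}}\quad (p>p_v)$$

where $\alpha_v$ is the vapor volume fraction, $\alpha_{\mathrm{nuc}}$ the nucleation-site volume fraction, $R_B$ the bubble radius, $\rho_v$ and $\rho_l$ the vapor and liquid densities, and $F_{\mathrm{vap}}$, $F_{\mathrm{cond}}$ empirical coefficients.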

Keywords: cavitation characteristics, high-speed projectile, numerical predictions, trajectory stability, water-entry, water-exit

Procedia PDF Downloads 125
10849 The Reliability and Shape of the Force-Power-Velocity Relationship of Strength-Trained Males Using an Instrumented Leg Press Machine

Authors: Mark Ashton Newman, Richard Blagrove, Jonathan Folland

Abstract:

The force-velocity (F-V) profile of an individual has been shown to influence success in ballistic movements, independent of the individual's maximal power output; therefore, effective and accurate evaluation of an individual's F-V characteristics, and not solely maximal power output, is important. The relatively narrow range of loads typically utilised during force-velocity profiling protocols, due to the difficulty of obtaining force data at high velocities, may bring into question the accuracy of the F-V slope, along with predictions pertaining to the maximum force that the system can produce at a velocity of zero (F₀) and the theoretical maximum velocity against no load (V₀). As such, the reliability of the slope of the force-velocity profile, as well as of V₀, has been shown to be relatively poor in comparison to F₀ and maximal power, and it has been recommended to assess velocity at loads closer to both F₀ and V₀. The aim of the present study was to assess the relative and absolute reliability of a novel instrumented leg press machine which enables the assessment of force and velocity data at loads equivalent to ≤ 10% of one repetition maximum (1RM) through to 1RM during a ballistic leg press movement. The reliability of maximal and mean force, velocity, and power, as well as the respective force-velocity and power-velocity relationships and the linearity of the force-velocity relationship, were evaluated. Sixteen strength-trained males (23.6 ± 4.1 years; 177.1 ± 7.0 cm; 80.0 ± 10.8 kg) attended four sessions. During the initial visit, participants were familiarised with the leg press, modified to include a mounted force plate (Type SP3949, Force Logic, Berkshire, UK) and a Micro-Epsilon WDS-2500-P96 linear positional transducer (LPT) (Micro-Epsilon, Merseyside, UK). Peak isometric force (IsoMax) and a dynamic 1RM, both from a starting position of 81% leg length, were recorded for the dominant leg. In visits two to four, the participants carried out the leg press movement at loads equivalent to ≤ 10%, 30%, 50%, 70%, and 90% 1RM. IsoMax was recorded during each testing visit prior to the dynamic F-V profiling repetitions. The novel leg press machine used in the present study appears to be a reliable tool for measuring F- and V-related variables across a range of loads, including velocities closer to V₀, when compared to some of the findings within the published literature. Both linear and polynomial models demonstrated good to excellent levels of reliability for the F-V slope (SFV) and F₀, respectively, with reliability for V₀ being good using a linear model but poor using a 2nd-order polynomial model. A polynomial regression model may therefore be most appropriate when using a similar unilateral leg press setup to predict maximal force production capabilities, given only a 5% difference between F₀ and the obtained IsoMax values, with a linear model being best suited to predict V₀. The parameter extraction for a linear profile is sketched below.
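
A minimal sketch of extracting F₀, V₀, the slope (SFV), and maximal power from a linear fit, assuming the common model F(v) = F₀ - (F₀/V₀)v; the load-velocity data are illustrative, not the study's.

```python
# A minimal sketch of linear force-velocity profiling. Data are
# illustrative placeholders, not the study's measurements.
import numpy as np

# Mean velocity (m/s) and mean force (N) at loads from ~10% to 90% 1RM
velocity = np.array([1.60, 1.25, 0.95, 0.60, 0.30])
force = np.array([650.0, 980.0, 1310.0, 1640.0, 1900.0])

slope, f0 = np.polyfit(velocity, force, 1)  # F = f0 + slope * v (slope < 0)
v0 = -f0 / slope          # theoretical max velocity at zero load
pmax = f0 * v0 / 4.0      # max power of a linear F-V profile

print(f"F0 = {f0:.0f} N, V0 = {v0:.2f} m/s, SFV = {slope:.0f} N.s/m, "
      f"Pmax = {pmax:.0f} W")
```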

Keywords: force-velocity, leg-press, power-velocity, profiling, reliability

Procedia PDF Downloads 38
10848 Enhancing Sell-In and Sell-Out Forecasting Using an Ensemble Machine Learning Method

Authors: Vishal Das, Tianyi Mao, Zhicheng Geng, Carmen Flores, Diego Pelloso, Fang Wang

Abstract:

Accurate sell-in and sell-out forecasting is a ubiquitous problem in the retail industry and an important element of any demand planning activity. As a global food and beverage company, Nestlé has hundreds of products in each geographical location in which it operates. Each product has its own sell-in and sell-out time series, which are forecasted on weekly and monthly scales for demand and financial planning. To address this challenge, Nestlé Chile, in collaboration with the Amazon Machine Learning Solutions Lab, has developed an in-house solution that uses machine learning models for forecasting. Similar products are combined such that there is one model for each product category. In this way, the models learn from a larger set of data, and there are fewer models to maintain. The solution is scalable to all product categories and is flexible enough to include any new product, or eliminate any existing product, in a product category based on requirements. We show how the machine learning development environment on Amazon Web Services (AWS) can be used to explore a set of forecasting models and create business intelligence dashboards compatible with the existing demand planning tools in Nestlé. We explored recent deep neural networks (DNNs), which show promising results for a variety of time series forecasting problems. Specifically, we used DeepAR, an autoregressive model that can group similar time series together and provide robust predictions. To further enhance the accuracy of the predictions and include domain-specific knowledge, we designed an ensemble approach combining DeepAR with an XGBoost regression model. As part of the ensemble approach, we interlinked the sell-out and sell-in information to ensure that a future sell-out influences the current sell-in predictions; this interlinking is sketched below. Our approach outperforms the benchmark statistical models by more than 50%. The machine learning (ML) pipeline implemented in the cloud is currently being extended to other product categories and is being adopted by other geomarkets.
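
A minimal sketch of the interlinked ensemble idea on synthetic data: an XGBoost regressor forecasts next week's sell-in from lagged sell-in plus the latest sell-out, and its prediction is blended with a second forecaster. The paper's DeepAR member is not reproduced here; a seasonal-naive forecast stands in for it, and all names and numbers are placeholders.

```python
# A minimal sketch of the ensemble: XGBoost on lag features, blended with a
# seasonal-naive forecast standing in for the DeepAR member. Data synthetic.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
weeks = 260
sales = 100 + 20 * np.sin(2 * np.pi * np.arange(weeks) / 52) \
        + rng.normal(0, 5, weeks)

# Supervised framing: predict next week's sell-in from the last 8 weeks,
# plus last week's sell-out as the interlinking feature.
sellout = sales * 0.9 + rng.normal(0, 3, weeks)
lags = 8
X = np.column_stack([sales[i:weeks - lags + i] for i in range(lags)] +
                    [sellout[lags - 1:weeks - 1]])
y = sales[lags:]

model = XGBRegressor(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X[:-52], y[:-52])  # hold out the final year

xgb_pred = model.predict(X[-52:])
naive_pred = sales[-104:-52]                  # same week last year
ensemble = 0.5 * xgb_pred + 0.5 * naive_pred  # simple equal-weight blend

mape = np.mean(np.abs(ensemble - y[-52:]) / y[-52:]) * 100
print(f"Holdout MAPE: {mape:.1f}%")
```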

Keywords: sell-in and sell-out forecasting, demand planning, DeepAR, retail, ensemble machine learning, time-series

Procedia PDF Downloads 233
10847 E-Governance: A Key for Improved Public Service Delivery

Authors: Ayesha Akbar

Abstract:

Public service delivery has witnessed significant improvement with the integration of information and communication technology (ICT). ICT not only improves management structures with advanced technology for the surveillance of service delivery but also provides evidence for informed decisions and policy. Pakistan's public sector organizations have generally not been able to produce good results in ensuring service delivery. Notwithstanding this, some public sector organizations in Pakistan have adopted modern technology and proved their worth by providing better service delivery standards. These good indicators provide a sound basis for integrating technology in public sector organizations and for a shift of policy towards evidence-based policy making. Rescue-1122 is a public sector organization that provides emergency services and has proved to be a successful model for service delivery, saving human lives and supporting human development in Pakistan. Information about the organization was gathered using a qualitative research methodology, drawing broadly on primary and secondary sources, including the Rescue-1122 website and official reports of organizations such as the UNDP (United Nations Development Programme) and WHO (World Health Organization), and on 10 in-depth interviews with senior administrative staff working in the Lahore offices. This information has been incorporated into the study for a better understanding of the organization and its management procedures. Rescue-1122 represents a successful model of delivering services efficiently and dealing with disaster management. The management of Rescue-1122 has strategized its policies and procedures to develop a comprehensive model with the integration of technology. This model provides efficient service delivery and maintains the standards of the organization. The service delivery model of Rescue-1122 works on two fronts: the front-office interface and the back-office interface. The back office defines operating procedures and assures staff compliance, whereas the front office, equipped with the latest technology and good infrastructure, handles emergency calls. Both ends are integrated with satellite-based vehicle tracking, a wireless system, a fleet monitoring system, and IP cameras that monitor every move of the staff in order to provide better services and pinpoint distortions in the services. The standard time for reaching the emergency spot is 7 minutes, and while a case is being handled, the driver's behavior, traffic volume, and the technical assistance provided to the emergency case are monitored by the front office. The whole of this information is then uploaded from the provincial offices to the main dashboard at the Lahore headquarters. The latest technology is being leveraged by Rescue-1122 to deliver efficient services, investigate flaws where found, and generate data for informed decision making. Other public sector organizations in Pakistan could also develop such models to integrate technology for improving service delivery and to develop evidence for informed decisions and policy making.

Keywords: data, e-governance, evidence, policy

Procedia PDF Downloads 228
10846 A Review on Bioremediation of Waste Effluent Associated with the Pulp and Paper Industry

Authors: Adamu Muhammed Tukur

Abstract:

The pulp and paper industry is one of the fastest growing industries due to increased demand for paper products. To satisfy this ever-increasing demand, it adopts new technological innovations, some of which have proved to affect our environment negatively. Global consumption of paper has increased by 400% in the last four decades, which suggests that more research is required to assess the impact of industrial effluents on the environment and public health. Paper products are generally biodegradable; however, the processes involved in their production, which rely mainly on bleaching agents and other non-biodegradable substances, pose a serious problem to the environment. More than 250 chemicals are released in paper mill waste, and some are xenobiotics. Physical and chemical methods can be adopted for the remediation of the effluents but have proved to be costly and not safe for the environment. Biological methods, on the other hand, have been shown to be less costly and environmentally friendly. Microorganisms and their enzymes show a promising future for the bioremediation of paper mill effluents. Many studies show that one of the major pollutants in paper mill effluent is phenol, especially its chlorinated derivatives. Pentachlorophenol is extremely hazardous to living cells and therefore needs to be removed from the environment. Microorganisms, including bacteria and fungi, have the potential to degrade phenolic compounds, e.g., Bacillus stearothermophilus, Pseudomonas putida, Coriolus versicolor, Sphingomonas chlorophenolica, Fusarium sp., Bacillus subtilis, and P. aeruginosa. Enzymes used for the degradation include phenol hydroxylase, polyphenol oxidase, laccase, and peroxidase, among others. Lignin is another important pollutant; it is resistant to microbial degradation, but certain bacteria and fungi have been shown to degrade it. Among the fungi, white-rot fungi such as Fomes lividus and Trametes versicolor are the most important bioremediators. This review focuses on the use of microorganisms to reduce or eradicate pollutants released by the paper industry, and it can serve as a basis for further research, especially in the field of biotechnology.

Keywords: bioremediation, pulp and paper, pentachlorophenol, environment

Procedia PDF Downloads 308
10845 Comparisons of Co-Seismic Gravity Changes between GRACE Observations and the Predictions from the Finite-Fault Models for the 2012 Mw = 8.6 Indian Ocean Earthquake Off-Sumatra

Authors: Armin Rahimi

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) has been a very successful project for determining mass redistribution within the Earth system. Large deformations caused by earthquakes lie in the high-frequency band, but GRACE can only provide reliable estimates of gravitational changes in the low-to-medium frequency band. In this study, we computed the gravity changes after the 2012 Mw 8.6 Indian Ocean earthquake off Sumatra using the GRACE Level-2 monthly spherical harmonic (SH) solutions released by the University of Texas Center for Space Research (UTCSR). Moreover, we calculated gravity changes using different fault models derived from teleseismic data. The model predictions showed non-negligible discrepancies in gravity changes. However, after removing high-frequency signals using Gaussian filtering with a 350 km radius, commensurate with the GRACE spatial resolution, the discrepancies vanished: the spatial patterns of total gravity change predicted by all slip models became similar at the spatial resolution attainable by GRACE, and the predicted gravity changes were consistent with the GRACE-detected ones. Nevertheless, fault models that give different slip amplitudes lead proportionally to different amplitudes in the predicted gravity changes. The filtering step is sketched below.
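
For reference, a standard form of Gaussian smoothing applied to SH gravity-change coefficients, following Jekeli's normalized averaging kernel as popularized by Wahr et al.; the paper does not spell out its exact implementation, so this is quoted from the general literature, here normalized so that $W_0 = 1$:

$$\Delta\bar g(\theta,\lambda)=\sum_{l=0}^{l_{\max}} W_l \sum_{m=0}^{l}\left(\Delta C_{lm}\cos m\lambda+\Delta S_{lm}\sin m\lambda\right)\bar P_{lm}(\cos\theta)$$

$$W_0=1,\qquad W_1=\frac{1+e^{-2b}}{1-e^{-2b}}-\frac{1}{b},\qquad W_{l+1}=-\frac{2l+1}{b}\,W_l+W_{l-1},\qquad b=\frac{\ln 2}{1-\cos(r/a)}$$

with averaging radius r = 350 km and Earth radius a; the weights $W_l$ damp the high-degree (high-frequency) terms where the slip models disagree.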

Keywords: undersea earthquake, GRACE observation, gravity change, dislocation model, slip distribution

Procedia PDF Downloads 342
10844 Nanopriming Potential of Metal Nanoparticles against the Internally Seed-Borne Pathogen Ustilago tritici

Authors: Anjali Sidhu, Anju Bala, Amit Kumar

Abstract:

Metal nanoparticles have the potential to revolutionize agriculture owing to a rapidly expanding interdisciplinary application domain. Numerous patents and products incorporating engineered nanoparticles (NPs) have entered agro-applications with the collective goal of promoting efficiency as well as sustainability, with lower inputs and less waste than conventional products and approaches. Loose smut of wheat, caused by Ustilago segetum tritici, is an internally seed-borne disease. The pathogen lies dormant in the seed until the seed germinates, and its symptoms are expressed only at the reproductive stage of the plant. Various seed treatment agents are recommended for this disease, but due to the inappropriate seed treatment methods used by farmers, not every seed gets treated, and infected seeds escape the fungicidal action. The antimicrobial potential and small size of nanoparticles make them the material of choice, as they can enter each seed and restrict the pathogen inside it owing to the greater number of nanoparticles available per unit volume of the nanoformulations. Nanoparticles of diverse nature known for their in vitro antimicrobial activity, viz. ZnO, MgO, CuS, and Ag NPs, were synthesized, surface-modified, and characterized by traditional methods. They were applied to infected wheat seeds, which were then grown under pot conditions, and the pathogen's mycelium was tracked in the shoot and leaf regions of the seedlings by microscopic staining techniques. Mixed responses in the inhibition of this internal mycelium were observed. The time and method of application were concluded to be critical and were optimised in the present work. The results imply that field trials are needed to carry these pot trials to a commercial level. Successful field trials could pave the way to replacing high-dose organic fungicides with high residue levels.

Keywords: metal nanoparticles, nanopriming, seed borne pathogen, Ustilago segetum tritici

Procedia PDF Downloads 132
10843 Identification of Toxic Metal Deposition in Food Cycle and Its Associated Public Health Risk

Authors: Masbubul Ishtiaque Ahmed

Abstract:

Food chain contamination by heavy metals has become a critical issue in recent years because of their potential accumulation in biosystems through contaminated water, soil, and irrigation water. Industrial discharge, fertilizers, contaminated irrigation water, fossil fuels, sewage sludge, and municipal wastes are the major sources of heavy metal contamination of soils and of subsequent uptake by crops. The main objectives of this project were to determine the levels of minerals, trace elements, and heavy metals in the major foods and beverages consumed by poor and non-poor households of Dhaka city, to assess dietary exposure to heavy metal and trace metal contamination and its potential health implications, and to make recommendations for action. Heavy metals are naturally occurring elements that have a high atomic weight and a density at least five times greater than that of water. Their multiple industrial, domestic, agricultural, medical, and technological applications have led to their wide distribution in the environment, raising concerns over their potential effects on human health and the environment. Their toxicity depends on several factors, including the dose, route of exposure, and chemical species, as well as the age, gender, genetics, and nutritional status of exposed individuals. Because of their high degree of toxicity, arsenic, cadmium, chromium, lead, and mercury rank among the priority metals of public health significance. These metallic elements are considered systemic toxicants known to induce multiple organ damage, even at low levels of exposure. This review provides an analysis of their environmental occurrence, production and use, potential for human exposure, and molecular mechanisms of toxicity and carcinogenicity.

Keywords: food chain, mineral levels, trace elements, heavy metals, production and use, human exposure, toxicity, carcinogenicity

Procedia PDF Downloads 268
10842 Finite Element Modeling of the Ultrasonic Shot Peening Process Using Multiple Pin Impacts

Authors: Chao-xun Liu, Shi-hong Lu

Abstract:

In spite of its importance to the aerospace and automobile industries, little or no attention has been devoted to the accurate modeling of the ultrasonic shot peening (USP) process. The purpose of this study is therefore to conduct a finite element analysis of the process using a realistic multiple-pin-impact model with the explicit solver of ABAQUS. We investigate the effect of several key parameters on the residual stress distribution within the target, including impact velocity, incident angle, friction coefficient between the pins and the target, and the number of impacts. The results reveal that the impact velocity and the number of impacts have a pronounced effect, and that impacting vertically produces the most favorable residual stress distribution. We then compare the results with data from USP experiments and verify the accuracy of the model. The analysis of the multiple-pin-impact data reveals the relationships between peening process parameters and peening quality, which are useful for identifying the parameters that need to be controlled and regulated in order to produce a more beneficial compressive residual stress distribution within the target.

Keywords: ultrasonic shot peening, finite element, multiple pins, residual stress, numerical simulation

Procedia PDF Downloads 438
10841 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications

Authors: H. Hruschka

Abstract:

This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely latent Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and the topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as a joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables; variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine, which then generates second-layer hidden variables. Finally, in the third layer, hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves: one half is used for estimation, the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. The performance of the topic models is disappointing, as the holdout log likelihood of the correlated topic model (which is better than latent Dirichlet allocation) is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net, whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret the hidden variables discovered by binary factor analysis, the restricted Boltzmann machine, and the deep belief net; the product categories to which the hidden variables are related differ strongly between these three models. To derive managerial implications, we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications, as they disagree about which categories are associated with higher basket-size increases due to a promotion. Of course, recommendations based on better-performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research with appropriate extensions. Including predictors, especially marketing variables such as price, seems an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of product categories. A sketch of the basic RBM layer is given below.
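
A minimal sketch of the first RBM layer on binary basket data, assuming scikit-learn's BernoulliRBM (trained by contrastive divergence, not necessarily the paper's estimation procedure) and synthetic baskets; stacking a second RBM on the hidden activations is what turns this architecture into a deep belief net.

```python
# A minimal sketch of an RBM on binary basket data. Baskets are synthetic
# placeholders sized like the grocery data set described above.
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(1)
n_baskets, n_categories = 1000, 169

# Synthetic binary purchase matrix: 1 if the category is in the basket
baskets = (rng.random((n_baskets, n_categories)) < 0.05).astype(np.float64)

rbm = BernoulliRBM(n_components=20, learning_rate=0.05,
                   n_iter=30, random_state=0)
rbm.fit(baskets)

# Hidden-unit activations: a low-dimensional representation of each basket.
# Feeding these into a second RBM would form the next layer of a DBN.
hidden = rbm.transform(baskets)
print(hidden.shape)  # (1000, 20)
```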

Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models

Procedia PDF Downloads 181
10840 Surgical Planning for the Removal of a Cranial Spheno-Orbital Meningioma Using Personalized Polymeric Prototypes Obtained with Additive Manufacturing Techniques

Authors: Freddy Patricio Moncayo-Matute, Pablo Gerardo Peña-Tapia, Vázquez-Silva Efrén, Paúl Bolívar Torres-Jara, Diana Patricia Moya-Loaiza, Gabriela Abad-Farfán

Abstract:

This study describes a clinical case and the results of applying additive manufacturing to surgical planning for the removal of a cranial spheno-orbital meningioma. It verifies that the use of personalized anatomical models and cutting guides helps manage the approach to cranial anomalies. The application of Fused Deposition Modeling (FDM), a low-cost additive manufacturing technology, enables the printing of a test anatomical model, which in turn reduces surgery time as well as the morbidity rate, and the printing of a personalized cutting guide, a valuable aid to the surgeon that improves the precision of the intervention and reduces its invasiveness during the craniotomy. As part of the results, post-surgical follow-up is included as an instrument to verify the patient's recovery and the validity of the procedure.

Keywords: surgical planning, additive manufacturing, rapid prototyping, fused deposition modeling, custom anatomical model

Procedia PDF Downloads 80
10839 Assessing the Impact of Climate Change on Pulses Production in Khyber Pakhtunkhwa, Pakistan

Authors: Khuram Nawaz Sadozai, Rizwan Ahmad, Munawar Raza Kazmi, Awais Habib

Abstract:

Climate change and crop production are intrinsically associated with each other. This research study was therefore designed to assess the impact of climate change on pulse production in the southern districts of the Khyber Pakhtunkhwa (KP) province of Pakistan. Two pulses, chickpea and mung bean, were selected for the study. Climatic variables such as temperature, humidity, and precipitation, together with pulse production and the area under pulse cultivation, were the major variables of the study. Secondary data on the climatic and crop variables for a period of thirty-four years (1986-2020) were obtained from the Pakistan Meteorological Department and the Agriculture Statistics of KP, respectively. Panel data sets for the chickpea and mung bean crops were estimated separately, and the analysis validated that both were balanced panels. The Hausman specification test was run separately for both panel data sets; its findings suggested that the fixed effects model was appropriate for the chickpea panel data, whereas the random effects model was appropriate for the mung bean panel data. The major findings confirm that maximum temperature is statistically significant for chickpea yield: if the maximum temperature increases by 1 °C, chickpea yield can increase by 0.0463 units. However, the impact of precipitation was insignificant. Furthermore, humidity was statistically significant and positively associated with chickpea yield. In the case of mung bean, the minimum temperature contributed significantly to yield. This study concludes that temperature and humidity can significantly enhance pulse yields. It is recommended that the capacity of pulse growers be built so that they can adopt climate change adaptation strategies; moreover, the government should ensure the availability of climate-resilient pulse varieties to encourage pulse cultivation. The model-selection workflow is sketched below.
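
A minimal sketch of the fixed-vs-random-effects workflow with a Hausman test, assuming the `linearmodels` package and a synthetic district-year panel; the variable names and data are illustrative, not the study's.

```python
# A minimal sketch of FE vs RE panel estimation plus a Hausman test.
# Synthetic district-year panel; variables are illustrative placeholders.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS, RandomEffects

rng = np.random.default_rng(0)
districts, years = 5, 34
idx = pd.MultiIndex.from_product(
    [range(districts), range(1986, 1986 + years)], names=["district", "year"])
df = pd.DataFrame({
    "tmax": rng.normal(32, 2, districts * years),
    "humidity": rng.normal(55, 8, districts * years),
    "precip": rng.normal(300, 60, districts * years),
}, index=idx)
df["yield"] = (0.05 * df["tmax"] + 0.01 * df["humidity"]
               + rng.normal(0, 0.2, len(df)))

exog = df[["tmax", "humidity", "precip"]]
fe = PanelOLS(df["yield"], exog, entity_effects=True).fit()
re = RandomEffects(df["yield"], exog).fit()

# Hausman statistic: H = (b_FE - b_RE)' [V_FE - V_RE]^(-1) (b_FE - b_RE),
# chi-squared with k degrees of freedom under H0 (RE is consistent).
d = (fe.params - re.params).values
v = (fe.cov - re.cov).values
h_stat = float(d @ np.linalg.inv(v) @ d)
print(f"Hausman statistic: {h_stat:.2f} (df = {len(d)})")
```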

Keywords: climate change, pulses productivity, agriculture, Pakistan

Procedia PDF Downloads 32
10838 Self-Tuning Controller for Reducing Cycle-to-Cycle Variations in SI Engines

Authors: Alirıza Kaleli, M. Akif Ceviz, Erdoğan Güner, Köksal Erentürk

Abstract:

The cyclic variations that occur in spark ignition engines, especially under specific operating conditions, make the maximum pressure variable across successive in-cylinder pressure cycles. Minimization of cyclic variations is of great importance for operating effectively near the lean limit, or at low speed and load. Cyclic variations may reduce the power output of the engine, lead to operational instabilities, and result in undesirable engine vibration and noise. In this study, spark timing is controlled in order to reduce the cyclic variations in spark ignition engines. First, an ARMAX model was developed between spark timing and maximum pressure using system identification techniques. Using this model, the maximum pressure of the next cycle is predicted. A self-tuning minimum variance controller was then designed to adjust the spark timing over consecutive cycles of the first cylinder of the test engine in order to regulate the in-cylinder maximum pressure. The performance of the proposed controller is illustrated in real time, and experimental results show that the controller has a reliable effect on the cycle-to-cycle variations of maximum cylinder pressure when the engine works under low-speed conditions. The model structure is sketched below.
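
For reference, the standard ARMAX structure linking spark timing u(k) to peak pressure y(k); the exact polynomial orders used in the study are not given here:

$$A(q^{-1})\,y(k)=q^{-d}\,B(q^{-1})\,u(k)+C(q^{-1})\,e(k)$$

with $A(q^{-1})=1+a_1 q^{-1}+\dots+a_{n_a} q^{-n_a}$, $B$ and $C$ defined analogously, $e(k)$ white noise, and $d$ the dead time. A minimum variance controller then chooses $u(k)$ to minimize $E\{[y(k+d)-y_{\mathrm{ref}}]^2\}$ given the recursively identified model, which is what makes the scheme self-tuning.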

Keywords: cyclic variations, cylinder pressure, SI engines, self tuning controller

Procedia PDF Downloads 468
10837 An Integrated Assessment (IA) of Water Resources in the Speightstown Catchment, Barbados Using a GIS-Based Decision Support System

Authors: Anuradha Maharaj, Adrian Cashman

Abstract:

The cross-cutting nature of water as a resource translates into the need for a better understanding of its movement, storage, and loss at all points in the hydro-socioeconomic cycle. An integrated approach to sustainability means quantitatively understanding the linkages within this cycle, the role of water managers in resource allocation, and the critical factors influencing scarcity. The Water Evaluation and Planning tool (WEAP) is an integrative model that combines catchment-scale hydrologic processes with a water management model driven by environmental requirements and socioeconomic demands. The concept of demand priorities is included to represent the areas of greatest use within a given catchment. Located on Barbados' west coast, Speightstown and its surrounding areas encompass a well-developed tourist, residential, and agricultural zone. The main water resource for this area, and for the rest of the island, is groundwater. The availability of groundwater in Barbados may be adversely affected by projected changes in climate, such as reduced wet-season rainfall; economic development and changing sector priorities, together with climate-related changes, also have the potential to affect water resource abundance and, by extension, the allocation of resources in areas such as Speightstown. To investigate the potential impacts on the Speightstown area specifically, a WEAP model of the study area was developed to estimate the presently available water (baseline reference scenario, 2000-2010). From this baseline, it is envisioned that projected changes in availability will be explored for the near-term (2035-2045) and medium/long-term (2065-2075) time frames. The resulting estimates can help water managers better evaluate the status of, and identify trends in, water use and formulate adaptation measures to offset future deficits.

Keywords: water evaluation and planning system (WEAP), water availability, demand and supply, water allocation

Procedia PDF Downloads 336
10836 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetes

Authors: Fei Gao, Rodolfo C. Raga Jr.

Abstract:

This research aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project seeks to improve the early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. Using the Diabetes Health Indicators Dataset from Kaggle as the research data, the phase relation values of each attribute were used to analyze and choose the attributes that might influence the predicted risk. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as the foundation for training and testing the algorithms. A rigorous cross-validation process is applied, and performance is assessed using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators. The evaluation loop is sketched below.
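
A minimal sketch of the evaluation loop: cross-validate several classifiers on the five stated metrics. Synthetic data stand in for the Kaggle Diabetes Health Indicators set, and only three of the eight compared algorithms are shown.

```python
# A minimal sketch of cross-validated model comparison on the five metrics
# named above. Data and model list are illustrative placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=21, weights=[0.85],
                           random_state=0)  # imbalanced, like most health data

scoring = ["accuracy", "precision", "recall", "f1", "roc_auc"]
models = {
    "logreg": LogisticRegression(max_iter=1000),
    "tree": DecisionTreeClassifier(random_state=0),
    "forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, model in models.items():
    cv = cross_validate(model, X, y, cv=5, scoring=scoring)
    summary = ", ".join(f"{m}={cv['test_' + m].mean():.3f}" for m in scoring)
    print(f"{name}: {summary}")
```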

Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle

Procedia PDF Downloads 56
10835 Investigating a Modern Accident Analysis Model for Textile Building Fires through Numerical Reconstruction

Authors: Mohsin Ali Shaikh, Weiguo Song, Rehmat Karim, Muhammad Kashan Surahio, Muhammad Usman Shahid

Abstract:

Fire investigations face challenges due to the complexity of fire development, and real-world accidents lack repeatability, making it difficult to apply standardized approaches. The unpredictable nature of fires and the unique conditions of each incident add to this complexity, requiring innovative methods and tools for effective analysis and reconstruction. This study proposes a modern accident analysis model based on numerical reconstruction for fire investigation in textile buildings. The method employs computer simulation to enhance the overall effectiveness of textile-building investigations: materials and evidence collected from past incidents are used to reconstruct fire occurrence, progression, and the catastrophic processes involved. The approach is demonstrated through a case study of a tragic textile factory fire in Karachi, Pakistan, which claimed 257 lives. The reconstruction method proves invaluable for determining fire origins, assessing losses, establishing accountability, and, significantly, providing preventive insights for complex fire incidents.

Keywords: fire investigation, numerical simulation, fire safety, fire incident, textile building

Procedia PDF Downloads 54
10834 Developing a Web-Based Workflow Management System in Cloud Computing Platforms

Authors: Wang Shuen-Tai, Lin Yu-Ching, Chang Hsi-Ya

Abstract:

Cloud computing is the innovative and leading information technology model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort. In this paper, we aim at the development of a workflow management system for cloud computing platforms, building on our previous research on the dynamic allocation of cloud computing resources and its workflow process. We took advantage of HTML5 technology and developed a web-based workflow interface. To enable the combination of many tasks running on the cloud platform in sequence, we designed a mechanism and developed an execution engine for workflow management on clouds. We also established a prediction model, integrated with the job queuing system, to estimate the waiting time and cost of individual tasks on different computing nodes, thereby helping users achieve maximum performance at the lowest payment. This proposed effort has the potential to provide an efficient, resilient, and elastic environment for cloud computing platforms. The development also helps boost user productivity by offering a flexible workflow interface that lets users design and control their tasks' flow from anywhere.

Keywords: web-based, workflow, HTML5, cloud computing, queuing system

Procedia PDF Downloads 297
10833 Teaching and Learning Jazz Improvisation Using Bloom's Taxonomy of Learning Domains

Authors: Graham Wood

Abstract:

The 20th century saw the introduction of many new approaches to music making, including the structured and academic study of jazz improvisation. The rise of school and tertiary jazz programs was rapid and spread around the globe in a matter of decades. The curriculum taught in these new programs was often developed in an ad-hoc manner, owing to the lack of written literature in this new and rapidly expanding area and to pedagogical principles vastly different from those of the classical music education then prevalent in school and tertiary programs. There is widespread information about the theory and techniques used by jazz improvisers, but methods for practicing these concepts so as to achieve the best outcomes for students and teachers are much harder to find. This research project draws on the author's experiences as a studio jazz piano teacher, ensemble teacher, and classroom improvisation lecturer over fifteen years and suggests an alignment with Bloom's taxonomy of learning domains. This alignment categorizes the different tasks that need to be taught and practiced so that teacher and student can devise a well-balanced and effective practice routine, and so that the teacher can develop an effective teaching program. These techniques have proved very useful in ensuring that a good balance of cognitive, psychomotor, and affective skills is taught to students in a range of learning contexts.

Keywords: bloom, education, jazz, learning, music, teaching

Procedia PDF Downloads 242
10832 Exploring 'Attachment Theory' in the Context of Early Childhood Education

Authors: Wendy Lee

Abstract:

From the mid-twentieth century onward, the notion of 'attachment' has been used to define the optimum relationship between young children and their carers; it was first applied to parents and young children and, more recently, to early childhood educators and the children in their care. However, it is seldom, if ever, asked whether the notion of 'attachment', and more especially so-called Attachment Theory as propounded by John Bowlby and others, provides a sound basis for conceptualising child-adult relationships in the early years. Even if appropriate in the context of the family, the use of the term raises a number of questions when applied in early childhood education. Research has shown that our youngest children (infants) in centre-based early childhood care settings are given the utmost priority for building 'attachments' with their educators. But exactly when, how, and why does this priority diminish, and should it, for preschoolers? This presentation will elaborate on such issues and will argue that there is a need to reconceptualise and redefine how 'quality relationships' should be measured and implemented in the daily practices and pedagogical methods adopted by early childhood educators. Moreover, this presentation will include data from an empirical study that observed various early childhood educators and children in Australian early childhood centres. Lastly, the thoughts, feelings, and desires of parents of children in centre-based early childhood care regarding the terms 'attachment' and 'quality relationships' will be shared, in the hope that we can take one step closer to bridging the needs of families, children, early childhood centres, educators, and the wider community.

Keywords: attachment, early childhood education, pedagogy, relationships

Procedia PDF Downloads 181
10831 Aspects Concerning Flame Propagation of Various Fuels in the Combustion Chamber of Four-Valve Engines

Authors: Zoran Jovanovic, Zoran Masonicic, S. Dragutinovic, Z. Sakota

Abstract:

In this paper, results concerning the flame propagation of various fuels in a particular combustion chamber with four tilted valves are elucidated. Flame propagation is represented by the evolution of the spatial distribution of temperature in various cut-planes within the combustion chamber, while the flame front location is determined by means of the zones with the maximum temperature gradient. The results presented are only a small part of a broader ongoing research activity in the field of multidimensional modeling of reactive flows in combustion chambers with complicated geometries, encompassing various turbulence models, different fuels, and combustion models. For turbulence, two different models were applied: the standard k-ε model and the k-ξ-f model. Flame propagation results were analyzed and presented for two different hydrocarbon fuels, CH4 and C8H18. In the reactive case, all the differences ensuing from the different turbulence models, which are obvious for non-reactive flows, are annihilated entirely: the interplay between the fluid flow pattern and flame propagation is invariant with respect to both the turbulence model and the fuel applied, indicating that flame propagation through the unburned mixture of CH4 and C8H18 fuels is not chemically controlled.

Keywords: automotive flows, flame propagation, combustion modelling, CNG

Procedia PDF Downloads 277
10830 Reciprocity and Empathy in Motivating Altruism among Sixth Grade Students

Authors: Rylle Evan Gabriel Zamora, Micah Dennise Malia, Abygail Deniese Villabona

Abstract:

The primary motivators of altruism are usually viewed as mutually exclusive. In this study, we wanted to know whether the two primary motivators, reciprocity and empathy, can work together in motivating altruism; that is, whether there is a significant interaction of effects between reciprocity and empathy. To show how this may occur, we devised the combined altruism model, which is based on Batson's empathy-altruism hypothesis. A sample of 120 sixth-grade students was randomly selected and then randomly assigned to four treatment groups. A 2x2 between-subjects design was used, with empathy and reciprocity as independent variables and altruism as the dependent variable. The study made use of effort-based materials, in which subjects were required to complete a task or a puzzle to help a person in a given scenario; two videos, one of which primed empathy, were also used. These, along with Witt and Boleman's adapted Self-Reported Altruism Scale, were used to determine an individual's altruism. Both variables were found to be significant in motivating altruism, with empathy being the greater of the two; however, there was no significant interaction of effects between them. To explain why this occurred, we turned to the combined altruism model, which suggests that when empathically primed, we tend not to think of ourselves when helping others. Future studies could focus on other variables, especially age, which is said to be one of the greatest factors influencing the results of the experiment.

Keywords: reciprocity, empathy, altruism, experimental psychology, social psychology

Procedia PDF Downloads 239
10829 Transitioning Teacher Identity during COVID-19: An Australian Early Childhood Education Perspective

Authors: J. Jebunnesa, Y. Budd, T. Mason

Abstract:

COVID-19 changed the pedagogical expectations of early childhood education as many teachers across Australia had to quickly adapt to new teaching practices such as remote teaching. An important factor in the successful implementation of any new teaching and learning approach is teacher preparation, however, due to the pandemic, the transformation to remote teaching was immediate. A timely question to be asked is how early childhood teachers managed the transition from face-to-face teaching to remote teaching and what was learned through this time. This study explores the experiences of early childhood educators in Australia during COVID-19 lockdowns. Data were collected from an online survey conducted through the official Facebook forum of “Early Childhood Education and Care Australia,” and a constructivist grounded theory methodology was used to analyse the data. Initial research results suggest changing expectations of teachers’ roles and responsibilities during the lockdown, with a significant category related to transitioning teacher identities emerging. The concept of transitioning represents the shift from the role of early childhood educator to educational innovator, essential worker, social worker, and health officer. The findings illustrate the complexity of early childhood educators’ roles during the pandemic.

Keywords: changing role of teachers, constructivist grounded theory, lessons learned, teaching during COVID-19

Procedia PDF Downloads 83
10828 A Study on Thermal and Flow Characteristics under Solar Radiation for a Single-Span Greenhouse by Computational Fluid Dynamics Simulation

Authors: Jonghyuk Yoon, Hyoungwoon Song

Abstract:

Recently, there has been increasing interest in smart farming, which applies modern Information and Communication Technologies (ICT) to agriculture and provides a methodology for optimizing production efficiency by automatically managing crop growing conditions. To obtain high performance and stability in a smart greenhouse, it is important to identify the effects of various working parameters, such as ventilation fan capacity and vent opening area. In the present study, a 3-dimensional CFD (Computational Fluid Dynamics) simulation of a single-span greenhouse was conducted using the commercial program Ansys CFX 18.0, in order to characterize the internal thermal and flow behavior. To numerically model solar radiation, which spreads over a wide range of wavelengths, a multiband model that discretizes the spectrum into finite wavelength bands based on Wien's law was applied to the simulation. In addition, the absorption coefficient of the vinyl covering, which varies with wavelength band, was applied based on the Beer-Lambert law, as sketched below. To validate the numerical method, the computed temperatures at specific monitoring points were compared with experimental data: average error rates of 12.2-14.2% were observed, and the numerical temperature distributions are in good agreement with the experimental data. The results of the present study can provide useful information for the design of various greenhouses. This work was supported by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture, Forestry and Fisheries (IPET) through the Advanced Production Technology Development Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA) (315093-03).
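
For reference, the Beer-Lambert attenuation applied per wavelength band: the radiation intensity in band $\lambda$ decays exponentially along the path length $s$ through the cover,

$$I_{\lambda}(s)=I_{\lambda,0}\,e^{-\alpha_{\lambda}\,s}$$

where $\alpha_{\lambda}$ is the band's absorption coefficient for the vinyl and $I_{\lambda,0}$ the incident intensity in that band; summing the transmitted intensities over the finite set of bands recovers the total radiative load inside the greenhouse.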

Keywords: single-span greenhouse, CFD (computational fluid dynamics), solar radiation, multiband model, absorption coefficient

Procedia PDF Downloads 121
10827 Artificial Intelligence Impact on Strategic Stability

Authors: Darius Jakimavicius

Abstract:

Artificial intelligence (AI) is the subject of intense debate in the international arena, identified both as a technological breakthrough and as a component of the strategic stability effect. Both the kinetic and non-kinetic development of AI, and its application in the national strategies of the great powers, may trigger a change in the security situation. AI is generally faster, more capable, and more efficient than humans, and there is a temptation to transfer decision-making and control responsibilities to it. AI that, once activated, can select and act on targets without further intervention by a human operator blurs the boundary between human and robot (machine) warfare, or perhaps merges the two. AI acts as a force multiplier that speeds up decision-making and reaction times on the battlefield. The role of humans is increasingly moving away from direct decision-making and from the command and control processes involving the use of force. It is worth noting that the autonomy and precision of AI systems make the process of strategic stability more complex. Deterrence theory is currently in a phase of development in which deterrence is under further strain and crisis due to the complexity of the evolving models enabled by AI. Based on the concept of strategic stability and on deterrence theory, it is appropriate to develop further research on the development and impact of AI in order to assess it from both a scientific and a technical perspective: to capture a new niche in the scientific literature and academic terminology, to clarify the conditions for deterrence, and to identify the potential uses, impacts, and possibly quantities of AI. The research problem is the impact of artificial intelligence developed by the great powers on strategic stability. This thesis seeks to assess the impact of AI on strategic stability and deterrence principles, with human exclusion from the decision-making and control loop as a key axis. The interaction between AI and human actions and interests can determine fundamental changes in the great powers' defense and deterrence, and the development and application of AI-based great-power strategies can lead to a change in strategic stability.

Keywords: artificial intelligence, strategic stability, deterrence theory, decision-making loop

Procedia PDF Downloads 27
10826 Multi-Tier Data Collection and Estimation Utilizing a Queue Model in Wireless Sensor Networks

Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh

Abstract:

In this paper, a target parameter is estimated with desirable precision in a hierarchical wireless sensor network (WSN), while the proposed algorithm also tries to prolong network lifetime as much as possible by using an efficient data collection algorithm. The target parameter's distribution function is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collection algorithm. The FC reconstructs the underlying phenomenon from the collected data. Considering the aggregation level x, the goal is to provide the essential infrastructure for finding the best value of the aggregation level so as to prolong network lifetime as much as possible while guaranteeing the desired accuracy (the required sample size depends entirely on the desired precision). First, the sample size calculation is discussed; second, the average queue length based on the M/M[x]/1/K queue model is determined and used to calculate energy consumption. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy. Both calculations are sketched below.
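
A minimal sketch of the two calculations the paper chains together: (1) the sample size needed for a target estimation precision, and (2) the mean queue length of a batch-arrival M/M[x]/1/K node, here obtained by simulation rather than the paper's analytical model. All parameters are illustrative.

```python
# A minimal sketch: sample-size rule plus a batch-arrival finite queue.
# Parameters are illustrative placeholders.
import math
import random

def required_sample_size(sigma, error, confidence_z=1.96):
    """n >= (z * sigma / E)^2 for estimating a mean within +/-E."""
    return math.ceil((confidence_z * sigma / error) ** 2)

def mean_queue_length(lam=1.5, mu=5.0, batch=3, capacity=10,
                      horizon=50_000, seed=7):
    """Discrete-event simulation of M/M[x]/1/K: Poisson batch arrivals of
    fixed size x, exponential service, buffer capped at K (overflow lost)."""
    random.seed(seed)
    t, q, area = 0.0, 0, 0.0
    next_arrival = random.expovariate(lam)
    next_departure = math.inf
    while t < horizon:
        t_next = min(next_arrival, next_departure)
        area += q * (t_next - t)          # time-weighted queue length
        t = t_next
        if t == next_arrival:
            q = min(q + batch, capacity)  # batch joins, excess is dropped
            next_arrival = t + random.expovariate(lam)
            if next_departure == math.inf:
                next_departure = t + random.expovariate(mu)
        else:
            q -= 1
            next_departure = (t + random.expovariate(mu)) if q else math.inf
    return area / t

print("n =", required_sample_size(sigma=4.0, error=0.5))
print("E[queue length] =", round(mean_queue_length(), 2))
```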

Keywords: aggregation, estimation, queuing, wireless sensor network

Procedia PDF Downloads 174