Search results for: demand forecast updating
2583 The Effect of Satisfaction with the Internet on Online Shopping Attitude With TAM Approach Controlled By Gender
Authors: Velly Anatasia
Abstract:
In the last few decades, extensive research has been conducted into information technology (IT) adoption, testing a series of factors considered to be essential for improved diffusion. Some studies analyze IT characteristics such as usefulness, ease of use and/or security, others focus on the emotions and experiences of users, and a third group attempts to determine the importance of socioeconomic user characteristics such as gender, educational level and income. The situation is similar regarding e-commerce, where the majority of studies have taken for granted the importance of including these variables when studying e-commerce adoption, as these were believed to explain or forecast who buys or who will buy on the internet. Nowadays, the internet has become a marketplace suitable for all ages, incomes and both genders, and thus the prejudices linked to the advisability of selling certain products should be revised. The objective of this study is to test whether the socioeconomic characteristics of experienced e-shoppers, such as gender, really moderate the effect of their perceptions on online shopping behavior. Current development of the online environment and the experience acquired by individuals from previous e-purchases can attenuate or even nullify the effect of these characteristics. The individuals analyzed are experienced e-shoppers, i.e., individuals who often make purchases on the internet. The Technology Acceptance Model (TAM) was broadened to include previous use of the internet and perceived self-efficacy. The perceptions and behavior of e-shoppers are based on their own experiences. The information was obtained using self-administered questionnaires distributed to respondents accustomed to using the internet. The causal model is estimated using structural equation modeling (SEM) techniques, followed by tests of the moderating effect of socioeconomic variables on perceptions and online shopping behavior. The expected findings of this study indicate that gender moderates neither the influence of previous use of the internet nor the perceptions of e-commerce. In short, these characteristics do not condition the behavior of the experienced e-shopper.
Keywords: Internet shopping, age groups, gender, income, electronic commerce
Procedia PDF Downloads 337
2582 A Comparative Analysis of Classification Models with Wrapper-Based Feature Selection for Predicting Student Academic Performance
Authors: Abdullah Al Farwan, Ya Zhang
Abstract:
In today's educational arena, it is critical to understand educational data and be able to evaluate important aspects, particularly data on student achievement. Educational Data Mining (EDM) is a research area that focuses on uncovering patterns and information in data from educational institutions. Teachers, if they are able to predict their students' class performance, can use this information to improve their teaching abilities. It has evolved into valuable knowledge that can be used for a wide range of objectives; for example, a strategic plan can be used to generate high-quality education. Based on previous data, this paper recommends employing data mining techniques to forecast students' final grades. In this study, five data mining methods, Decision Tree, JRip, Naive Bayes, Multi-layer Perceptron, and Random Forest with wrapper feature selection, were used on two datasets relating to Portuguese language and mathematics classes. The results showed the effectiveness of using data mining learning methodologies in predicting student academic success. The classification accuracy achieved with the selected algorithms lies in the range of 70-94%. Among all the selected classification algorithms, the lowest accuracy is achieved by the Multi-layer Perceptron algorithm, which is close to 70.45%, and the highest accuracy is achieved by the Random Forest algorithm, which is close to 94.10%. This proposed work can assist educational administrators in identifying poor-performing students at an early stage and perhaps implementing motivational interventions to improve their academic success and prevent educational dropout.
Keywords: classification algorithms, decision tree, feature selection, multi-layer perceptron, Naïve Bayes, random forest, students' academic performance
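As a rough illustration of the approach described in this abstract (not the authors' code), the sketch below pairs a wrapper-based feature selector with a Random Forest classifier in scikit-learn; the file name, column names, and parameter values are assumptions for demonstration only.

```python
# Minimal sketch (not the authors' code): wrapper-based feature selection
# around a Random Forest classifier, evaluated with cross-validation.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

# Hypothetical file and column names; replace with the actual dataset.
data = pd.read_csv("student_performance.csv")
X = data.drop(columns=["final_grade"])
y = data["final_grade"]          # e.g., pass/fail or grade band

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# Forward wrapper search: greedily add features that improve CV accuracy.
selector = SequentialFeatureSelector(
    clf, n_features_to_select=10, direction="forward", scoring="accuracy", cv=5
)
selector.fit(X, y)
X_selected = selector.transform(X)

scores = cross_val_score(clf, X_selected, y, cv=5, scoring="accuracy")
print("Selected features:", list(X.columns[selector.get_support()]))
print("Mean CV accuracy: %.3f" % scores.mean())
```

The selection is "wrapped" around the classifier in the sense that candidate feature subsets are scored by the cross-validated accuracy of the classifier itself rather than by a filter statistic.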
Procedia PDF Downloads 166
2581 Efficient Field-Oriented Motor Control on Resource-Constrained Microcontrollers for Optimal Performance without Specialized Hardware
Authors: Nishita Jaiswal, Apoorv Mohan Satpute
Abstract:
The increasing demand for efficient, cost-effective motor control systems in the automotive industry has driven the need for advanced, highly optimized control algorithms. Field-Oriented Control (FOC) has established itself as the leading approach for motor control, offering precise and dynamic regulation of torque, speed, and position. However, as energy efficiency becomes more critical in modern applications, implementing FOC on low-power, cost-sensitive microcontrollers poses significant challenges due to the limited availability of computational and hardware resources. Currently, most solutions rely on high-performance 32-bit microcontrollers or Application-Specific Integrated Circuits (ASICs) equipped with Floating Point Units (FPUs) and Hardware Accelerated Units (HAUs). These advanced platforms enable rapid computation and simplify the execution of complex control algorithms like FOC. However, these benefits come at the expense of higher costs, increased power consumption, and added system complexity. These drawbacks limit their suitability for embedded systems with strict power and budget constraints, where achieving energy and execution efficiency without compromising performance is essential. In this paper, we present an alternative approach that utilizes optimized data representation and computation techniques on a 16-bit microcontroller without FPUs or HAUs. By carefully optimizing data formats and employing fixed-point arithmetic, we demonstrate how the precision and computational efficiency required for FOC can be maintained in resource-constrained environments. This approach eliminates the performance overhead associated with floating-point operations and hardware acceleration, providing a more practical solution in terms of cost, scalability, and execution-time efficiency, allowing faster response in motor control applications. Furthermore, it enhances system design flexibility, making it particularly well-suited for applications that demand stringent control over power consumption and costs.
Keywords: field-oriented control, fixed-point arithmetic, floating point unit, hardware accelerator unit, motor control systems
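The abstract does not include the authors' implementation, which would be integer C code on the target 16-bit microcontroller; the Python sketch below only simulates the kind of Q15 fixed-point arithmetic involved (here applied to a Clarke transform) to illustrate how FOC math can run without floating-point hardware. The scaling choices and example currents are illustrative assumptions.

```python
# Illustrative sketch (not the authors' implementation): Q15 fixed-point
# arithmetic of the kind used to run FOC math without an FPU, simulated
# in Python for clarity. On a real 16-bit MCU this would be integer C code.
Q15_ONE = 1 << 15                  # represents 1.0 in Q15

def to_q15(x: float) -> int:
    return max(-Q15_ONE, min(Q15_ONE - 1, int(round(x * Q15_ONE))))

def q15_mul(a: int, b: int) -> int:
    # 16x16 -> 32-bit multiply, then shift back to Q15 (as a hardware MAC would).
    return (a * b) >> 15

# Clarke transform, assuming balanced phases (ia + ib + ic = 0):
# i_alpha = ia, i_beta = (ia + 2*ib) / sqrt(3)
ONE_OVER_SQRT3_Q15 = to_q15(1.0 / 3 ** 0.5)

def clarke_alpha_beta(ia_q15: int, ib_q15: int):
    i_alpha = ia_q15
    i_beta = q15_mul(ia_q15 + 2 * ib_q15, ONE_OVER_SQRT3_Q15)
    return i_alpha, i_beta

ia, ib = to_q15(0.30), to_q15(-0.10)
print(clarke_alpha_beta(ia, ib))   # fixed-point alpha/beta currents
```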
Procedia PDF Downloads 15
2580 Domestic Rooftop Rainwater Harvesting for Prevention of Urban Flood in the Gomti Nagar Region of Lucknow, Uttar Pradesh, India
Authors: Rajkumar Ghosh
Abstract:
Urban flooding is a common occurrence throughout Asia. Almost every city is exposed to urban floods in some fashion, and city residents are particularly vulnerable. Pluvial and fluvial flooding are the most prominent causes of urban flooding in the Gomti Nagar region of Lucknow, Uttar Pradesh, India. Pluvial flooding is often regarded as less damaging; it is caused by heavy rainfall, seasonal rainfall fluctuations, water flowing off concrete infrastructure, blockages of the drainage system, and insufficient drainage capacity or low infiltration capacity. However, this study considers pluvial flooding in Lucknow to be a significant source of cumulative damage over time, and the risks of such events are increasing as a result of ageing infrastructure, changing hazard exposure, rapid urbanization, massive waterlogging and global warming. As a result, urban flooding has emerged as a critical field of study. The popularity of analytical approaches for projecting the spatial extent of flood hazards has skyrocketed. To address future urban flood resilience, more effort is needed to enhance both hydrodynamic models and analytical tools to simulate risks under present and forecast conditions. Proper urban planning, with adequate drainage systems and ample space for high infiltration capacity, is required to reduce urban flooding. A better India with no urban flooding can be realized by installing household rooftop rainwater collection systems in every structure. According to the current study, domestic RTRWHs are strongly recommended as an alternative source of water, as well as to prevent surface runoff and urban floods in this region of Lucknow and in other urban areas of India.
Keywords: rooftop rainwater harvesting, urban flood, pluvial flooding, fluvial flooding
Procedia PDF Downloads 85
2579 Realization of Sustainable Urban Society by Personal Electric Transporter and Natural Energy
Authors: Yuichi Miyamoto
Abstract:
Two points are raised regarding the energy sector in the modern period. The first is a vast and growing energy demand, and the second is the environmental impact associated with it. The enormous consumption of fossil fuel by mobile units is leading to its rapid depletion, and nuclear power is not without problems either. A modal shift that utilizes personal transporters and independent power generation is very effective for realizing a sustainable society, and this paper proposes that the world continue to work in this direction. For the energy of the future society, innovation in battery technology and the use of natural energy is a big key, and it is also necessary to save on energy consumption.
Keywords: natural energy, modal shift, personal transportation, battery
Procedia PDF Downloads 409
2578 Overview of Risk Management in Electricity Markets Using Financial Derivatives
Authors: Aparna Viswanath
Abstract:
Electricity spot prices are highly volatile under optimal generation capacity scenarios due to factors such as the non-storability of electricity, peak demand at certain periods, generator outages, fuel uncertainty for renewable energy generators, and the huge investments and time needed for generation capacity expansion. As a result, market participants are exposed to price and volume risk, which has led to the development of risk management practices. This paper provides an overview of risk management practices by market participants in electricity markets using financial derivatives.
Keywords: financial derivatives, forward, futures, options, risk management
Procedia PDF Downloads 479
2577 Buy-and-Hold versus Alternative Strategies: A Comparison of Market-Timing Techniques
Authors: Jonathan J. Burson
Abstract:
With the rise of virtually costless, mobile-based trading platforms, stock market trading activity has increased significantly over the past decade, particularly for the millennial generation. This increased stock market attention, combined with the recent market turmoil due to the economic upset caused by COVID-19, makes the topics of market-timing and forecasting particularly relevant. While the overall stock market saw an unprecedented, historically long bull market from March 2009 to February 2020, the end of that bull market reignited a search by investors for a way to reduce risk and increase return. Similar searches for outperformance occurred in the early and late 2000s as the Dotcom bubble burst and the Great Recession led to years of negative returns for mean-variance index investors. Extensive research has been conducted on fundamental analysis, technical analysis, macroeconomic indicators, microeconomic indicators, and other techniques, all using different methodologies and investment periods, in pursuit of higher returns with lower risk. The enormous variety of timeframes, data, and methodologies used by the diverse forecasting methods makes it difficult to compare the outcome of each method directly to other methods. This paper establishes a process to evaluate the market-timing methods in an apples-to-apples manner based on simplicity, performance, and feasibility. Preliminary findings show that certain technical analysis models provide a higher return with lower risk when compared to the buy-and-hold method and to other market-timing strategies. Furthermore, technical analysis models tend to be easier for individual investors both in terms of acquiring the data and in analyzing it, making technical analysis-based market-timing methods the preferred choice for retail investors.
Keywords: buy-and-hold, forecast, market-timing, probit, technical analysis
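The abstract does not specify which technical-analysis models were tested; purely as an illustration of the kind of comparison described, the sketch below contrasts buy-and-hold with a simple moving-average crossover timing rule on synthetic prices. The window lengths and return parameters are arbitrary assumptions.

```python
# Illustrative sketch only: comparing buy-and-hold with a simple moving-average
# crossover timing rule on synthetic prices. The paper's actual models, data,
# and parameters are not reproduced here.
import numpy as np

rng = np.random.default_rng(42)
returns = rng.normal(0.0004, 0.01, 2500)        # synthetic daily returns
prices = 100 * np.cumprod(1 + returns)

def moving_average(x, window):
    return np.convolve(x, np.ones(window) / window, mode="valid")

# Align the last 2000 trading days of each series.
short_ma = moving_average(prices, 50)[-2000:]
long_ma = moving_average(prices, 200)[-2000:]
rets = returns[-2000:]

# In the market only when the short MA is above the long MA (signal lagged one day).
signal = (short_ma > long_ma).astype(float)
timed = np.concatenate(([0.0], signal[:-1])) * rets

buy_hold_growth = np.prod(1 + rets)
timed_growth = np.prod(1 + timed)
print(f"Buy-and-hold growth: {buy_hold_growth:.2f}x, MA-timed growth: {timed_growth:.2f}x")
```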
Procedia PDF Downloads 97
2576 Spatial Variation of WRF Model Rainfall Prediction over Uganda
Authors: Isaac Mugume, Charles Basalirwa, Daniel Waiswa, Triphonia Ngailo
Abstract:
Rainfall is a major climatic parameter affecting many sectors such as health, agriculture and water resources. Its quantitative prediction remains a challenge to weather forecasters, although numerical weather prediction models are increasingly being used for rainfall prediction. The performance of six convective parameterization schemes, namely the Kain-Fritsch scheme, the Betts-Miller-Janjic scheme, the Grell-Deveny scheme, the Grell-3D scheme, the Grell-Fretas scheme, and the New Tiedke scheme of the Weather Research and Forecasting (WRF) model, regarding quantitative rainfall prediction over Uganda is investigated using the root mean square error for the March-May (MAM) 2013 season. The MAM 2013 seasonal rainfall amount ranged from 200 mm to 900 mm over Uganda, with the northern region receiving comparatively lower rainfall amounts (200–500 mm); western Uganda (270–550 mm); eastern Uganda (400–900 mm) and the Lake Victoria basin (400–650 mm). A spatial variation in the rainfall amount simulated by the different convective parameterization schemes was noted, with the Kain-Fritsch scheme overestimating the rainfall amount over northern Uganda (300–750 mm) but presenting comparable rainfall amounts over eastern Uganda (400–900 mm). The Betts-Miller-Janjic, Grell-Deveny, and Grell-3D schemes underestimated the rainfall amount over most parts of the country, especially the eastern region (300–600 mm). The Grell-Fretas scheme captured the rainfall amount over the northern region (250–450 mm) but underestimated rainfall over the Lake Victoria basin (150–300 mm), while the New Tiedke scheme generally underestimated the rainfall amount over many areas of Uganda. For deterministic rainfall prediction, the Grell-Fretas scheme is recommended over northern Uganda, while the Kain-Fritsch scheme is recommended over the eastern region.
Keywords: convective parameterization schemes, March-May 2013 rainfall season, spatial variation of parameterization schemes over Uganda, WRF model
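A minimal sketch of the verification score used to rank the schemes (root mean square error between simulated and observed seasonal rainfall) is shown below; the station totals and scheme outputs are placeholders, not the study's data.

```python
# Minimal sketch of the verification metric used in the study: root mean
# square error between simulated and observed seasonal rainfall totals.
# The values below are placeholders, not the study's data.
import numpy as np

def rmse(simulated, observed):
    simulated, observed = np.asarray(simulated), np.asarray(observed)
    return np.sqrt(np.mean((simulated - observed) ** 2))

observed_mm = np.array([420.0, 515.0, 610.0, 700.0])          # station totals (mm)
schemes = {
    "Kain-Fritsch":        np.array([510.0, 600.0, 650.0, 780.0]),
    "Betts-Miller-Janjic": np.array([350.0, 430.0, 500.0, 590.0]),
}
for name, simulated_mm in schemes.items():
    print(f"{name}: RMSE = {rmse(simulated_mm, observed_mm):.1f} mm")
```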
Procedia PDF Downloads 311
2575 Household Water Practices in a Rapidly Urbanizing City and Its Implications for the Future of Potable Water: A Case Study of Abuja Nigeria
Authors: Emmanuel Maiyanga
Abstract:
Access to sufficiently good quality freshwater has been a global challenge, but more notably in low-income countries, particularly in Sub-Saharan countries, of which Nigeria is one. Urban populations are soaring, especially in many low-income countries; the existing centralised water supply infrastructures are ageing and inadequate; moreover, in households, people's lifestyles have become more water-demanding. So, people mostly devise coping strategies where municipal supply is perceived to have failed. This development threatens the future of groundwater and calls for a review of management strategy and research approach. The various issues associated with water demand management in low-income countries, and Nigeria in particular, are well documented in the literature. However, the way people use water daily in households and the reasons they do so, and how the situation is constructing demand among the middle-class population in Abuja, Nigeria, are poorly understood. This is what this research aims to unpack. This is achieved by using the social practices research approach (which is based on the Theory of Practices) to understand how this situation impacts the shared groundwater resource. A qualitative method was used for data gathering. This involved audio-recorded interviews of householders and water professionals in the private and public sectors. It also involved observation, note-taking, and document study. The data were analysed thematically using NVIVO software. The research reveals the major household practices that draw on water at the domestic scale; they include water sourcing, body hygiene and sanitation, laundry, kitchen, and outdoor practices (car washing, domestic livestock farming, and gardening). Among all the practices, water sourcing, body hygiene, kitchen, and laundry practices are identified to impact most on groundwater, with the scale of impact varying with household peculiarities. Water sourcing practices involve people sourcing mostly from personal boreholes because the municipal water supply is perceived as inadequate and unreliable in terms of service delivery and water quality, and people prefer the easier and unlimited access and control afforded by boreholes. Body hygiene practices reveal that every respondent prefers bucket bathing at least once daily, and the majority bathe twice or more every day. Frequency is determined by the feeling of hotness and dirt on the skin; thus, people bathe to cool down, stay clean, and satisfy perceived social, religious, and hygiene demands. Kitchen practices consume significant amounts of water, as people run the tap for washing vegetables during daily food preparation and for dishwashing after each meal. Laundry practices reveal that most people wash clothes most frequently (twice a week) during hot and dusty weather, and washing by hand in basins and buckets is the most prevalent method and the most water-wasting due to soap overdose. The research also reveals poor water governance as a major cause of the current inadequate municipal water delivery. The implication of poor governance and the widespread use of boreholes is uncontrolled abstraction of groundwater to satisfy desired household practices, thereby putting the future of the shared aquifer at great risk of total depletion, with attendant multiplying effects on the people and the environment as the population continues to soar.
Keywords: boreholes, groundwater, household water practices, self-supply
Procedia PDF Downloads 123
2574 Analysis of Trends and Challenges of Using Renewable Biomass for Bioplastics
Authors: Namasivayam Navaranjan, Eric Dimla
Abstract:
The world needs more quality food, shelter and transportation to meet the demands of a growing population and to improve the living standard of those who currently live below the poverty line. Materials are essential commodities for various applications, including food and pharmaceutical packaging, building and automobiles. Petroleum-based plastics are among the most widely used materials for these applications, and their demand is expected to increase. The use of plastics has environment-related issues because a considerable amount of the plastic used worldwide is disposed of in landfills, where its resources are wasted, the material takes up valuable space and blights communities. Some countries have been implementing regulations and/or legislation to increase the reuse, recycling, renewal and remanufacturing of materials, as well as to minimise the use of non-environmentally friendly materials such as petroleum plastics. However, the issue of material waste is still a concern in countries with weak environmental regulations. The development of materials, mostly bioplastics, from renewable biomass resources has become popular in the last decade. It is widely believed that substitution of up to 90% of total plastics consumption by bioplastics is technically possible. The global demand for bioplastics is estimated to be approximately six times larger than in 2010. Recently, standard polymers like polyethylene (PE), polypropylene (PP), polyvinyl chloride (PVC) or polyethylene terephthalate (PET), but also high-performance polymers such as polyamides or polyesters, have been totally or partially substituted by their renewable equivalents. An example is polylactide (PLA) being used as a substitute in films and injection-moulded products made of petroleum plastics, e.g., PET. The starting raw materials for bio-based materials are usually sugars or starches that are mostly derived from food resources, and partly recycled materials from food or wood processing. The risk of lower food availability due to increasing prices of basic grains, as a result of competition with biomass-based product sectors for feedstock, also needs to be considered for future bioplastic production. Manufacturing of bioplastic materials is often still reliant upon petroleum as an energy and materials source. Life Cycle Assessment (LCA) of bioplastic products has been conducted to determine the sustainability of a production route. However, the accuracy of LCA depends on several factors and needs improvement. Low oil prices and high production costs may also limit the technically possible growth of these plastics in the coming years.
Keywords: bioplastics, plastics, renewable resources, biomass
Procedia PDF Downloads 308
2573 Practical Challenges of Tunable Parameters in Matlab/Simulink Code Generation
Authors: Ebrahim Shayesteh, Nikolaos Styliaras, Alin George Raducu, Ozan Sahin, Daniel Pombo VáZquez, Jonas Funkquist, Sotirios Thanopoulos
Abstract:
One of the important requirements in many code generation projects is defining some of the model parameters as tunable. This helps to update the model parameters without performing the code generation again. This paper studies the concept of embedded code generation by the MATLAB/Simulink coder targeting the TwinCAT Simulink system. The generated runtime modules are then tested and deployed to the TwinCAT 3 engineering environment. However, defining the parameters as tunable in MATLAB/Simulink code generation targeting TwinCAT is not very straightforward. This paper focuses on this subject and reviews some of the techniques tested here to make the parameters tunable in generated runtime modules. Three techniques are proposed for this purpose, including normal tunable parameters, callback functions, and mask subsystems. Moreover, some test Simulink models are developed and used to evaluate the results of the proposed approaches. A brief summary of the study results is presented in the following. First of all, parameters defined as tunable and used in defining the values of other Simulink elements (e.g., the gain value of a gain block) can be changed after the code generation, and this value update will affect the values of all elements defined based on the values of the tunable parameter. For instance, if parameter K=1 is defined as a tunable parameter in the code generation process and this parameter is used to set the gain of a gain block in Simulink, the gain value for that block is equal to 1 in the TwinCAT environment after the code generation. But the value of K can be changed to a new value (e.g., K=2) in TwinCAT (without doing any new code generation in MATLAB); then, the gain value of the gain block will change to 2. Secondly, adding a callback function in the form of a 'pre-load function,' 'post-load function,' or 'start function' will not help to make the parameters tunable without performing a new code generation. This means that such MATLAB files are run before the code generation is performed, and the parameters defined/calculated in these files will be used as fixed values in the generated code. Thus, adding these files as callback functions to the Simulink model will not make these parameters flexible, since the MATLAB files will not be attached to the generated code. Therefore, to change the parameters defined/calculated in these files, the code generation should be done again. However, adding these files as callback functions forces MATLAB to run them before the code generation, and there is no need to define the parameters mentioned in these files separately. Finally, using a tunable parameter in defining/calculating the values of other parameters through a mask is an efficient method to change the value of the latter parameters after the code generation. For instance, if tunable parameter K is used in calculating the value of two other parameters K1 and K2 and, after the code generation, the value of K is updated in the TwinCAT environment, the value of parameters K1 and K2 will also be updated (without any new code generation).
Keywords: code generation, MATLAB, tunable parameters, TwinCAT
Procedia PDF Downloads 228
2572 An Analytical Review of Tourism Management in India with Special Reference to Maharashtra State
Authors: Anilkumar L. Rathod
Abstract:
This paper examines event tourism as a field of study and area of professional practice, updating the previous review article published in 2015. In this substantially extended review, a deeper analysis of the field's evolution and development is presented, charting the growth of the literature both chronologically and thematically. A framework for understanding and creating knowledge about events and tourism is presented, forming the basis that signposts established research themes and concepts and outlines future directions for research. In addition, the review focuses on constraining and propelling forces, ontological advances, contributions from key journals, and emerging themes and issues. It also presents a roadmap for research activity in event tourism. Published scholarly studies within this period are examined through content analysis, using such keywords as knowledge management, organizational learning, hospitality, tourism, tourist destinations, travel industry, hotels, lodging, motels, hotel industry, gaming, casino hotel and convention to search scholarly research journals. All contributions found are then screened for a hospitality and tourism theme. Researchers mostly discuss the knowledge management approach to improving information technology, marketing and strategic planning in order to gain competitive advantage. Overall, knowledge management research is still limited. Planned events in tourism are created for a purpose, and what was once the realm of individual and community initiatives has largely become the realm of professionals and entrepreneurs. The paper provides a typology of the four main categories of planned events within an event-tourism context, including the main venues associated with each. It also assesses whether differences exist between socio-demographic groupings. An analysis using primarily descriptive statistics indicated that both sub-samples had similar viewpoints, although Maharashtra residents tended to have higher scores pertaining to the consequences of gambling. It is suggested that the differences arise due to the greater exposure of Maharashtra residents to the influences of casino development.
Keywords: organizational learning, hospitality, tourism, tourist destinations, travel industry, hotels, lodging, motels, hotel industry, gaming, casino hotel, convention
Procedia PDF Downloads 238
2571 DeepLig: A de-novo Computational Drug Design Approach to Generate Multi-Targeted Drugs
Authors: Anika Chebrolu
Abstract:
Mono-targeted drugs can be of limited efficacy against complex diseases. Recently, multi-target drug design has been approached as a promising tool to fight against these challenging diseases. However, the scope of current computational approaches for multi-target drug design is limited. DeepLig presents a de-novo drug discovery platform that uses reinforcement learning to generate and optimize novel, potent, and multitargeted drug candidates against protein targets. DeepLig’s model consists of two networks in interplay: a generative network and a predictive network. The generative network, a Stack- Augmented Recurrent Neural Network, utilizes a stack memory unit to remember and recognize molecular patterns when generating novel ligands from scratch. The generative network passes each newly created ligand to the predictive network, which then uses multiple Graph Attention Networks simultaneously to forecast the average binding affinity of the generated ligand towards multiple target proteins. With each iteration, given feedback from the predictive network, the generative network learns to optimize itself to create molecules with a higher average binding affinity towards multiple proteins. DeepLig was evaluated based on its ability to generate multi-target ligands against two distinct proteins, multi-target ligands against three distinct proteins, and multi-target ligands against two distinct binding pockets on the same protein. With each test case, DeepLig was able to create a library of valid, synthetically accessible, and novel molecules with optimal and equipotent binding energies. We propose that DeepLig provides an effective approach to design multi-targeted drug therapies that can potentially show higher success rates during in-vitro trials.Keywords: drug design, multitargeticity, de-novo, reinforcement learning
Procedia PDF Downloads 97
2570 A Multi-Criteria Decision Making Approach for Disassembly-To-Order Systems under Uncertainty
Authors: Ammar Y. Alqahtani
Abstract:
In order to minimize the negative impact on the environment, it is essential to properly manage the waste generated from the premature disposal of end-of-life (EOL) products. Consequently, governments and international organizations have introduced new policies and regulations to minimize the amount of waste being sent to landfills. Moreover, consumers' environmental awareness has forced original equipment manufacturers to become more environmentally conscious. Therefore, manufacturers have thought of different ways to deal with waste generated from EOL products, viz., remanufacturing, reusing, recycling, or disposing of EOL products. The rate of depletion of virgin natural resources, and manufacturers' dependency on those resources, can be reduced when EOL products are remanufactured, reused, or recycled; this will also cut the amount of harmful waste sent to landfills. However, disposal of EOL products contributes to the problem and is therefore used as a last option. The number of EOL products needs to be estimated in order to fulfill the demand for components. Then, a disassembly process needs to be performed to extract individual components and subassemblies. Smart products, built with embedded sensors and network connectivity to enable the collection and exchange of data, utilize sensors that are implanted into products during production. These sensors are used by remanufacturers to predict an optimal warranty policy and time period that should be offered to customers who purchase remanufactured components and products. Sensor-provided data can help to evaluate the overall condition of a product, as well as the remaining lives of product components, prior to performing a disassembly process. In this paper, a multi-period disassembly-to-order (DTO) model is developed that takes into consideration the different system uncertainties. The DTO model is solved using Nonlinear Programming (NLP) over multiple periods. A DTO system is considered where a variety of EOL products are purchased for disassembly. The model's main objective is to determine the best combination of EOL products to be purchased from every supplier in each period that maximizes the total profit of the system while satisfying the demand. This paper also addresses the impact of sensor-embedded products on the cost of warranties. Lastly, a case study involving various simulation conditions is presented and analyzed to illustrate the applicability of the model.
Keywords: closed-loop supply chains, environmentally conscious manufacturing, product recovery, reverse logistics
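The paper's model is a multi-period nonlinear program; as a deliberately simplified, single-period linear stand-in that shows the basic structure (choose how many EOL units to buy from each supplier so that component demand is met at maximum profit), the following sketch uses scipy. All yields, costs, demands, and supply limits below are invented.

```python
# Deliberately simplified, single-period linear stand-in for the structure of a
# disassembly-to-order purchase plan (the paper's model is a multi-period
# nonlinear program). All yields, costs, and demands below are invented.
from scipy.optimize import linprog

# x[j] = number of EOL units bought from supplier j (treated as continuous here).
revenue_per_component = 4.0
component_yield = [2.0, 3.0]        # usable components per EOL unit, by supplier
unit_cost = [3.0, 5.5]              # purchase + disassembly cost per EOL unit
component_demand = 1000.0           # components required this period
supply_limit = [400.0, 600.0]       # units available from each supplier

# Maximize profit = revenue*yield.x - cost.x  ->  minimize the negative.
c = [unit_cost[j] - revenue_per_component * component_yield[j] for j in range(2)]
# Meet demand: yield . x >= demand  ->  -yield . x <= -demand
A_ub = [[-component_yield[0], -component_yield[1]]]
b_ub = [-component_demand]
bounds = [(0, supply_limit[0]), (0, supply_limit[1])]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("Units to buy per supplier:", res.x, "profit:", -res.fun)
```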
Procedia PDF Downloads 137
2569 Adopting the Community Health Workers Master List Registry for Community Health Workforce in Kenya
Authors: Gikunda Aloise, Mjema Saida, Barasa Herbert, Wanyungu John, Kimani Maureen
Abstract:
Background: The Community Health Workforce (CHW) consists of health care providers at the community level (Level 1) and serves as a bridge between the community and the formal healthcare system. This human resource has enormous potential to extend healthcare services and to ensure that the vulnerable, marginalized, and hard-to-reach populations have access to quality healthcare services at the community and primary health facility levels. However, these cadres are neither recognized nor remunerated, nor, in most instances, registered in a master list. Management and supervision of CHWs is not easy if their individual demographics, training capacity and incentives are not well documented through a centralized registry. Description: In February 2022, Amref supported the Kenya Ministry of Health in developing a community health workforce database called the Community Health Workers Master List Registry (CHWML), which is hosted in the Kenya Health Information System (KHIS) tracker. The CHW registration exercise was conducted through a sensitization meeting held by the County Community Health Focal Person for the Sub-County Community Health Focal Persons and Community Health Assistants, who uploaded information on individual demographics, training undertaken and incentives received by CHVs. Care was taken to ensure compliance with Kenyan laws on the availability and use of personal data as prescribed by the Data Protection Act, 2019 (DPA). Results and lessons learnt: By June 2022, 80,825 CHWs had been registered in the system: 78,174 (96%) CHVs and 2,636 (4%) CHAs. 25,235 (31%) are male, 55,505 (68%) are female and 85 (1%) are transgender. 39,979 (49%) had secondary education and 2,500 (3%) had no formal education. Only 27,641 (34%) received a monthly stipend. 68,436 CHVs (85%) had undergone basic training. However, there is a need to validate the data to align with the current situation in the counties. Conclusions/Next steps: The use of the CHWML will unlock opportunities for building more resilient and sustainable health systems and will inform financial planning, resource allocation, capacity development, and quality service delivery. The MOH will update the CHWML guidelines in adherence to the Data Protection Act, which will inform standard procedures for maintaining and updating the registry, and will integrate the community health workforce registry with the HRH system.
Keywords: community health registry, community health volunteers (CHVs), community health workers masters list (CHWML), data protection act
Procedia PDF Downloads 140
2568 Analysis of Decentralized on Demand Cross Layer in Cognitive Radio Ad Hoc Network
Authors: A. Sri Janani, K. Immanuel Arokia James
Abstract:
In cognitive radio ad hoc networks, different unlicensed users may acquire different available channel sets. This non-uniform spectrum availability imposes special design challenges for broadcasting in CR ad hoc networks. Cognitive radio automatically detects available channels in the wireless spectrum; this is a form of dynamic spectrum management. Cross-layer optimization is proposed, which allows far-away secondary users to also take part in channel access. It can thus increase the throughput and overcome collisions and time delay.
Keywords: cognitive radio, cross layer optimization, CR mesh network, heterogeneous spectrum, mesh topology, random routing optimization technique
Procedia PDF Downloads 359
2567 An Analysis of Relation Between Soil Radon Anomalies and Geological Environment Change
Authors: Mengdi Zhang, Xufeng Liu, Zhenji Gao, Ying Li, Zhu Rao, Yi Huang
Abstract:
As an open system, the earth is constantly undergoing the transformation and release of matter and energy. Fault zones are relatively discontinuous and fragile geological structures, and the release of material and energy inside the Earth is strongest in these relatively weak zones. Earthquake events frequently occur in fault zones and are closely related to tectonic activity in these zones. In earthquake precursor observation, monitoring the spatiotemporal changes in the release of related gases near fault zones (such as radon, hydrogen, carbon dioxide and helium) and analyzing earthquake precursor anomalies can be effective means of forecasting the occurrence of earthquake events. Radon, an inert radioactive gas generated during the decay of uranium and thorium, is not only an indicator for monitoring tectonic and seismic activity but also an important topic for ecological and environmental health, and it plays a crucial role in uranium exploration. At present, research on soil radon gas mainly focuses on the measurement of soil gas concentration and flux in fault zone profiles, while research on the correlation between spatiotemporal concentration changes in the same region and its geological background is relatively limited. In this paper, the Tangshan area in northern China is chosen as the research area. An analysis was first conducted of the seismic and geological background of the Tangshan area. Then, based on a quantitative analysis and comparison of radon concentrations measured in 2023 and 2010, combined with the study of seismic activity and environmental changes during that period, the spatiotemporal distribution characteristics and influencing factors were explored in order to analyze the gas emission characteristics of the Tangshan fault zone and its relationship with fault activity, with the aim of supporting future earthquake monitoring in the Tangshan area.
Keywords: radon, Northern China, soil gas, earthquake
Procedia PDF Downloads 82
2566 An Optimization Modelling to Evaluate Flights Scheduling at Tourist Airports
Authors: Dimitrios J. Dimitriou
Abstract:
Airports serving a tourist destination are an essential counterpart of the tourist demand supply chain, and their productivity is related to the region's attractiveness and is enhanced by the air transport business. In this paper, the evaluation framework for the scheduled flights between two tourist airports is taken into consideration. By adopting a systemic approach, the arrivals at an airport whose connectivity depends heavily on the departures of another major airport are reviewed. The methodological framework, based on inventory control theory, and the numerical example promote the use of the modelling formulation. The results would be essential for comparison with, and application to, other similar cases.
Keywords: airport connectivity, inventory control, optimization, optimum allocation
Procedia PDF Downloads 334
2565 Technology Futures in Global Militaries: A Forecasting Method Using Abstraction Hierarchies
Authors: Mark Andrew
Abstract:
Geopolitical tensions are at a thirty-year high, and the pace of technological innovation is driving asymmetry in force capabilities between nation states and between non-state actors. Technology futures are a vital component of defence capability growth, and investments in technology futures need to be informed by accurate and reliable forecasts of the options for ‘systems of systems’ innovation, development, and deployment. This paper describes a method for forecasting technology futures developed through an analysis of four key systems’ development stages, namely: technology domain categorisation, scanning results examining novel systems’ signals and signs, potential system-of systems’ implications in warfare theatres, and political ramifications in terms of funding and development priorities. The method has been applied to several technology domains, including physical systems (e.g., nano weapons, loitering munitions, inflight charging, and hypersonic missiles), biological systems (e.g., molecular virus weaponry, genetic engineering, brain-computer interfaces, and trans-human augmentation), and information systems (e.g., sensor technologies supporting situation awareness, cyber-driven social attacks, and goal-specification challenges to proliferation and alliance testing). Although the current application of the method has been team-centred using paper-based rapid prototyping and iteration, the application of autonomous language models (such as GPT-3) is anticipated as a next-stage operating platform. The importance of forecasting accuracy and reliability is considered a vital element in guiding technology development to afford stronger contingencies as ideological changes are forecast to expand threats to ecology and earth systems, possibly eclipsing the traditional vulnerabilities of nation states. The early results from the method will be subjected to ground truthing using longitudinal investigation.Keywords: forecasting, technology futures, uncertainty, complexity
Procedia PDF Downloads 115
2564 Japan's New Security Outlook: Implications for the US-Japan Alliance
Authors: Agustin Maciel-Padilla
Abstract:
This paper explores the most significant change to Japan's security strategy since the end of World War II, in particular the publication in late 2022, by Prime Minister Fumio Kishida's government, of three policy documents (the National Security Strategy [NSS], the National Defense Strategy and the Defense Buildup Program) that propose to expand the country's military capabilities and to increase military spending over a 5-year period. These policies represent a remarkable transformation of the defense-oriented policy Japan has followed since 1946. These proposals have been under analysis and debate since they were announced, as they also reflect Japan's historic ambition to strengthen its deterrence capabilities in the context of a more complex regional security environment. Even though this new defense posture has attracted significant international attention, it is far from a done deal, since there is still a long way to go to implement this vision owing to a wide variety of political and economic issues. Japan is currently experiencing the most dangerous security environment since the end of World War II, and this situation led Japan to intensify its dialogue with the United States to reflect a re-evaluation of deterrence in the face of a rapidly worsening security environment, a changing balance of power in East Asia, and the arrival of a new era of 'great power competition'. Japan's new documents, for instance, identify China and North Korea as posing, respectively, a strategic challenge and an imminent threat. Japan has also noted that Russia's invasion of Ukraine has contributed to eroding the foundation of the international order. It is considered that Russia's aggression was possible because Ukraine's defense capability was not enough for effective deterrence. Moreover, Japan's call for 'counterstrike capabilities' results from a recognition that China's and North Korea's ballistic and cruise missiles could overwhelm Japan's air and missile defense systems, and therefore there is an urgent need to strengthen deterrence and resilience. In this context, this paper focuses on the impact of these changes on the US-Japan alliance. Adapting this alliance to Tokyo's new ambitions and capabilities could be critical in terms of updating the traditional protection/access-to-bases arrangement, interoperability, and joint command and control issues, as well as the security-economy nexus. While China is Japan's largest trading partner and trade between the two has been growing, the US-Japan economic relationship has grown more slowly, even though US-Japan security cooperation has strengthened significantly in recent years.
Keywords: us-japan alliance, japan security, great power competition, interoperability
Procedia PDF Downloads 65
2563 Wastewater Treatment in the Abrasives Industry via Fenton and Photo-Fenton Oxidation Processes: A Case Study from Peru
Authors: Hernan Arturo Blas López, Gustavo Henndel Lopes, Antonio Carlos Silva Costa Teixeira, Carmen Elena Flores Barreda, Patricia Araujo Pantoja
Abstract:
Phenols are toxic to life and the environment and may come from many sources. Uncured phenolic monomers present in the phenolic resins used as binders in grinding wheels and emery paper can contaminate industrial wastewaters in abrasives manufacturing plants. Furthermore, vestiges of resol and novolac resins generated by the wear and tear of abrasives are also possible sources of water contamination by phenolics in these facilities. Fortunately, advanced oxidation by dark Fenton and photo-Fenton techniques is capable of oxidizing phenols and their degradation products up to their mineralization into H₂O and CO₂. The maximal allowable concentrations for phenols in Peruvian waterbodies are very low, such that insufficiently treated effluents from the abrasives industry are a potential environmental noncompliance. The current case study highlights findings obtained during the lab-scale application of Fenton's and photo-assisted Fenton's chemistries to real industrial wastewater samples from an abrasives manufacturing plant in Peru. The goal was to reduce the phenolic content and sample toxicity. For this purpose, two independent variables, reaction time and the effect of ultraviolet radiation, were studied for their impacts on the concentration of total phenols, total organic carbon (TOC), biological oxygen demand (BOD) and chemical oxygen demand (COD). In this study, diluted samples (1 L) of the industrial effluent were treated with Fenton's reagent (H₂O₂ and Fe²⁺ from FeSO₄.H₂O) for 10 min in a photochemical batch reactor (Alphatec RFS-500, Brazil) at pH 2.92. In the case of the photo-Fenton tests, ultraviolet lamps of 9 W (UV-A, UV-B and UV-C) were evaluated. All process conditions achieved 100% phenol degradation within 5 minutes. TOC, BOD and COD decreased by 49%, 52% and 86%, respectively (all processes together). However, the Fenton treatment was not capable of reducing BOD, COD and TOC below a certain value even after 10 minutes, in contrast to photo-Fenton. It was also possible to conclude that the processes studied here degrade other compounds in addition to phenols, which is an advantage. In all cases, elevated effluent dilution factors and high amounts of oxidant negatively impact the overall economy of the processes investigated here.
Keywords: fenton oxidation, wastewater treatment, phenols, abrasives industry
Procedia PDF Downloads 314
2562 Long Short-Term Memory Stream Cruise Control Method for Automated Drift Detection and Adaptation
Authors: Mohammad Abu-Shaira, Weishi Shi
Abstract:
Adaptive learning, a commonly employed solution to drift, involves updating predictive models online during their operation to react to concept drifts, thereby serving as a critical component and natural extension of online learning systems that learn incrementally from each example. This paper introduces LSTM-SCCM (Long Short-Term Memory Stream Cruise Control Method), a drift adaptation-as-a-service framework for online learning. LSTM-SCCM automates drift adaptation through prompt detection, drift magnitude quantification, dynamic hyperparameter tuning, short-term optimization and model recalibration for immediate adjustments, and, when necessary, long-term model recalibration to ensure deeper enhancements in model performance. LSTM-SCCM is incorporated into a suite of cutting-edge online regression models, assessing their performance across various types of concept drift using diverse datasets with varying characteristics. The findings demonstrate that LSTM-SCCM represents a notable advancement in both model performance and efficacy in handling concept drift occurrences. LSTM-SCCM stands out as the sole framework adept at effectively tackling concept drifts within regression scenarios. Its proactive approach to drift adaptation distinguishes it from conventional reactive methods, which typically rely on retraining after significant degradation in model performance caused by drifts. Additionally, LSTM-SCCM employs an in-memory approach combined with the Self-Adjusting Memory (SAM) architecture to enhance real-time processing and adaptability. The framework incorporates variable thresholding techniques and does not assume any particular data distribution, making it an ideal choice for managing high-dimensional datasets and efficiently handling large-scale data. Our experiments, which include abrupt, incremental, and gradual drifts across both low- and high-dimensional datasets with varying noise levels, applied to four state-of-the-art online regression models, demonstrate that LSTM-SCCM is versatile and effective, rendering it a valuable solution for online regression models addressing concept drift.
Keywords: automated drift detection and adaptation, concept drift, hyperparameters optimization, online and adaptive learning, regression
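LSTM-SCCM itself is not reproduced here; the sketch below shows only a generic threshold-based detect-and-recalibrate loop around an online regressor, i.e., the baseline style of reactive adaptation that the abstract contrasts with its proactive approach. The drift threshold, window sizes, and synthetic drifting stream are arbitrary choices.

```python
# Generic sketch (not LSTM-SCCM): an online regressor whose rolling error is
# monitored; when the error exceeds a threshold, drift is flagged and the model
# is recalibrated on a recent window. All constants here are illustrative.
from collections import deque
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
model = SGDRegressor(learning_rate="constant", eta0=0.01)
window = deque(maxlen=200)          # recent (x, y) pairs for recalibration
errors = deque(maxlen=50)           # rolling absolute errors
DRIFT_THRESHOLD = 2.0

def true_fn(x, t):
    # Abrupt concept drift at t = 3000: the underlying relation changes.
    return (3.0 * x + 1.0) if t < 3000 else (-2.0 * x + 5.0)

model.partial_fit([[0.0]], [1.0])   # initialize with one dummy sample
for t in range(6000):
    x = rng.uniform(-1, 1)
    y = true_fn(x, t) + rng.normal(0, 0.1)
    y_hat = model.predict([[x]])[0]
    errors.append(abs(y - y_hat))
    window.append((x, y))
    if len(errors) == errors.maxlen and np.mean(errors) > DRIFT_THRESHOLD:
        # Drift detected: recalibrate on the recent window only.
        X_w = [[xi] for xi, _ in window]
        y_w = [yi for _, yi in window]
        model = SGDRegressor(learning_rate="constant", eta0=0.01)
        model.partial_fit(X_w, y_w)
        errors.clear()
        print(f"drift handled at t={t}")
    else:
        model.partial_fit([[x]], [y])
```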
Procedia PDF Downloads 12
2561 IoT Continuous Monitoring Biochemical Oxygen Demand Wastewater Effluent Quality: Machine Learning Algorithms
Authors: Sergio Celaschi, Henrique Canavarro de Alencar, Claaudecir Biazoli
Abstract:
Effluent quality is of the highest priority for compliance with the permit limits of environmental protection agencies and ensures the protection of the local water system. Of the pollutants monitored, the biochemical oxygen demand (BOD) poses one of the greatest challenges: delayed BOD5 results from the lab take 7 to 8 analysis days, hindering a wastewater treatment plant's (WWTP's) ability to react to different situations and meet treatment goals. This work presents a solution to that problem; reducing the BOD turnaround time from days to hours is our quest. The solution is based on a system of two BOD bioreactors associated with Digital Twin (DT) and Machine Learning (ML) methodologies via an Internet of Things (IoT) platform to monitor and control a WWTP and support decision making. A DT is a virtual and dynamic replica of a production process. It requires the ability to collect and store real-time sensor data related to the operating environment; furthermore, it integrates and organizes the data on a digital platform and applies analytical models, allowing a deeper understanding of the real process and earlier detection of anomalies. In our system for continuous monitoring of the BOD removed by the effluent treatment process, the DT algorithm for analyzing the data uses ML on a parameterized chemical kinetic model. The continuous BOD monitoring system, capable of providing results in a fraction of the time required by BOD5 analysis, is composed of two thermally isolated batch bioreactors. Each bioreactor contains input/output access to the wastewater sample (influent and effluent); hydraulic conduction tubes; pumps and valves for the batch sample and dilution water; an air supply for dissolved oxygen (DO) saturation; a cooler/heater for sample thermal stability; an optical DO sensor based on fluorescence quenching; pH, ORP, temperature, and atmospheric pressure sensors; and a local PLC/CPU providing a TCP/IP data transmission interface. The dynamic monitoring range of the BOD system covers 2 mg/L < BOD < 2,000 mg/L. In addition to the BOD monitoring system, there are many other operational WWTP sensors. The CPU data is transmitted to and received from the digital platform, which in turn performs analyses at periodic intervals, aiming to feed the learning process. BOD bulletins and their credibility intervals are made available to web users at 12-hour intervals. The chemical kinetics ML algorithm is composed of a coupled system of four first-order ordinary differential equations for the molar masses of DO, the organic material present in the sample, biomass, and the products (CO₂ and H₂O) of the reaction. This system is solved numerically from its initial conditions: DO saturated, and the initial products of the kinetic oxidation process set to CO₂ = H₂O = 0. The initial values for organic matter and biomass are estimated by minimization of the mean square deviations. A real case of continuous monitoring of BOD wastewater effluent quality is being conducted by deploying an IoT application on a large wastewater purification system located in São Paulo, Brazil.
Keywords: effluent treatment, biochemical oxygen demand, continuous monitoring, IoT, machine learning
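The abstract specifies four coupled first-order ODEs for DO, organic matter, biomass, and products but not their exact rate laws; the sketch below assumes an oxidation rate proportional to substrate and biomass purely to show how such a system can be integrated numerically. All rate constants and initial values are placeholders, and in the paper the initial organic matter and biomass are fitted by least squares rather than fixed as here.

```python
# Sketch with an assumed rate law (the abstract specifies four coupled
# first-order ODEs for DO, organic matter S, biomass X, and products P,
# but not their exact form). Constants and initial values are placeholders.
import numpy as np
from scipy.integrate import solve_ivp

k_ox, k_growth, y_biomass = 0.25, 0.10, 0.4   # assumed rate constants (1/h)

def bod_kinetics(t, state):
    do, s, x, p = state                        # dissolved O2, substrate, biomass, products
    r = k_ox * s * x                           # assumed oxidation rate
    d_do = -r                                  # O2 consumed by oxidation
    d_s = -r - k_growth * s * x                # substrate oxidized + assimilated
    d_x = y_biomass * k_growth * s * x         # biomass growth
    d_p = r                                    # CO2 + H2O produced
    return [d_do, d_s, d_x, d_p]

state0 = [8.5, 1.2, 0.05, 0.0]                 # DO saturated; products start at zero
sol = solve_ivp(bod_kinetics, (0.0, 48.0), state0, t_eval=np.linspace(0, 48, 97))

oxygen_consumed = state0[0] - sol.y[0]         # proxy for exerted BOD over time
print("BOD proxy after 48 h:", round(oxygen_consumed[-1], 2))
```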
Procedia PDF Downloads 73
2560 Privacy Policy Prediction for Uploaded Image on Content Sharing Sites
Authors: Pallavi Mane, Nikita Mankar, Shraddha Mazire, Rasika Pashankar
Abstract:
Content sharing sites are very useful for sharing information and images. However, with the increasing use of content sharing sites, privacy and security concerns have also increased. There is a need to develop a tool for controlling user access to shared content. Therefore, we are developing an Adaptive Privacy Policy Prediction (A3P) system, which helps users create privacy settings for their images. We propose a two-level framework that assigns the best available privacy policy for a user's images according to the user's available history on the site.
Keywords: online information services, prediction, security and protection, web based services
Procedia PDF Downloads 358
2559 A Review of Travel Data Collection Methods
Authors: Muhammad Awais Shafique, Eiji Hato
Abstract:
Household trip data is of crucial importance for managing present transportation infrastructure as well as to plan and design future facilities. It also provides basis for new policies implemented under Transportation Demand Management. The methods used for household trip data collection have changed with passage of time, starting with the conventional face-to-face interviews or paper-and-pencil interviews and reaching to the recent approach of employing smartphones. This study summarizes the step-wise evolution in the travel data collection methods. It provides a comprehensive review of the topic, for readers interested to know the changing trends in the data collection field.Keywords: computer, smartphone, telephone, travel survey
Procedia PDF Downloads 313
2558 Stability of Pump Station Cavern in Chagrin Shale with Time
Authors: Mohammad Moridzadeh, Mohammad Djavid, Barry Doyle
Abstract:
An assessment of the long-term stability of a cavern in Chagrin shale excavated by the sequential excavation method was performed during and after construction. During the excavation of the cavern, deformations of rock mass were measured at the surface of excavation and within the rock mass by surface and deep measurement instruments. Rock deformations were measured during construction which appeared to result from the as-built excavation sequence that had potentially disturbed the rock and its behavior. Also some additional time dependent rock deformations were observed during and post excavation. Several opinions have been expressed to explain this time dependent deformation including stress changes induced by excavation, strain softening (or creep) in the beddings with and without clay and creep of the shaley rock under compressive stresses. In order to analyze and replicate rock behavior observed during excavation, including current and post excavation elastic, plastic, and time dependent deformation, Finite Element Analysis (FEA) was performed. The analysis was also intended to estimate long term deformation of the rock mass around the excavation. Rock mass behavior including time dependent deformation was measured by means of rock surface convergence points, MPBXs, extended creep testing on the long anchors, and load history data from load cells attached to several long anchors. Direct creep testing of Chagrin Shale was performed on core samples from the wall of the Pump Room. Results of these measurements were used to calibrate the FEA of the excavation. These analyses incorporate time dependent constitutive modeling for the rock to evaluate the potential long term movement in the roof, walls, and invert of the cavern. The modeling was performed due to the concerns regarding the unanticipated behavior of the rock mass as well as the forecast of long term deformation and stability of rock around the excavation.Keywords: Cavern, Chagrin shale, creep, finite element.
Procedia PDF Downloads 352
2557 Offshorability and the Lobby for Immigrant Labor
Authors: Ellen A. Holtmaat
Abstract:
Research on lobbying for immigration is limited, and the influence of offshorability on lobbying for immigration has not been extensively assessed. This research focuses on the U.S. and argues that offshorable firms have an 'outside option' when they are in need of labor, which makes them less likely to lobby for immigration in the lower-skilled sectors. Higher-skilled offshorable sectors often settle in the U.S., as the U.S. has a comparative advantage in these sectors. These companies compete globally and demand the world's best labor, which induces them to lobby for immigration. This relationship is assessed using lobby data available under the 1995 Lobbying Disclosure Act. Some evidence of the relationship is found, and the research suggests that offshorability might also influence lobbying more generally.
Keywords: immigration, lobbying, non-tradable sector, offshoring
Procedia PDF Downloads 292
2556 A Review on Geomembrane Characteristics and Application in Geotechnical Engineering
Authors: Sandra Ghavam Shirazi, Komeil Valipourian, Mohammad Reza Golhashem
Abstract:
This paper presents the basic ideas and mechanisms associated with the durability of geomembranes and discusses the factors influencing the service life and temperature of geomembrane liners. Geomembrane durability is characterized in terms of field performance and laboratory test outcomes under various conditions. Due to the high demand for geomembranes as landfill barriers and their crucial role in sensitive projects, a sufficient service life of geomembranes is very important; therefore, this paper discusses durability, the effect of temperature on geomembranes, and the role of this type of reinforcement in different types of soil. The role of geomembranes under earthquake loading is considered in the last part of the paper.
Keywords: geomembrane, durability, temperature, soil mechanics, soil
Procedia PDF Downloads 309
2555 The Role of Speed Reduction Model in Urban Highways Tunnels Accidents
Authors: Khashayar Kazemzadeh, Mohammad Hanif Dasoomi
Abstract:
With the increasing travel demand in cities, bridges and tunnels are viewed as fundamental components of urban transportation systems. Normally, due to the geometric constraints that arise in tunnels, the speed considered in tunnels is lower than the speed on the connected highways. Therefore, drivers tend to reduce speed near tunnel entrances. In this paper, the effect of speed reduction on accidents occurring at tunnel entrances is discussed. The relation between accident frequency and the parameters of speed, traffic volume and time of the accident in the studied tunnel is analyzed, and a mathematical model is proposed.
Keywords: urban highway, accident, tunnel, mathematical model
Procedia PDF Downloads 472
2554 The Relevance of Intellectual Capital: An Analysis of Spanish Universities
Authors: Yolanda Ramirez, Angel Tejada, Agustin Baidez
Abstract:
In recent years, intellectual capital reporting in higher education institutions has been gaining importance worldwide. Intellectual capital approaches become critical at universities, mainly because knowledge is the main output as well as the main input of these institutions. Universities produce knowledge, either through scientific and technical research (the results of investigation, publications, etc.) or through teaching (students trained and productive relationships with their stakeholders). The purpose of the present paper is to identify the intangible elements about which university stakeholders demand most information. The results of a study conducted at Spanish universities are used to identify which groups of universities have stakeholders that are more proactive regarding the disclosure of intellectual capital.
Keywords: intellectual capital, universities, Spain, cluster analysis
Procedia PDF Downloads 509