Search results for: neuromorphic computing systems
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10066

7546 Geospatial Modeling of Dry Snow Avalanches Distribution Using Geographic Information Systems and Remote Sensing: A Case Study of the Šar Mountains (Balkan Peninsula)

Authors: Uroš Durlević, Ivan Novković, Nina Čegar, Stefanija Stojković

Abstract:

Snow avalanches represent one of the most dangerous natural phenomena in mountain regions worldwide. The material and human losses they cause can be very significant. In this study, geographic information systems and remote sensing were used to analyze the natural conditions of the Šar Mountains for geospatial modeling of dry slab avalanches. For this purpose, the Fuzzy Analytic Hierarchy Process (FAHP) multi-criteria analysis method was applied, within which fifteen environmental criteria were analyzed and evaluated. Based on this analysis, it was determined that a significant area of the Šar Mountains is very highly susceptible to the occurrence of dry slab avalanches. The obtained data can be of significant use to local governments, emergency services, and other institutions that deal with natural disasters at the local level. To the best of our knowledge, this is one of the first studies in the Republic of Serbia to use the FAHP method for geospatial modeling of dry slab avalanches.
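The abstract does not detail the FAHP computation; as a rough, non-fuzzy illustration of the underlying multi-criteria step, the classical AHP weight derivation from a pairwise comparison matrix could be sketched as follows (the three criteria and all comparison values are hypothetical, not taken from the study):

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three terrain criteria
# (slope, aspect, land cover); entry [i][j] says how much more important
# criterion i is than criterion j on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector of A gives the criterion weights.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()

# Consistency ratio check (random index RI = 0.58 for n = 3);
# CR below 0.1 is conventionally acceptable.
n = A.shape[0]
ci = (eigvals.real.max() - n) / (n - 1)
cr = ci / 0.58
print(weights.round(3), round(cr, 3))
```

A fuzzy variant (FAHP) would replace the crisp entries with triangular fuzzy numbers before deriving the weights; the consistency check is analogous.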

Keywords: GIS, FAHP, Šar Mountains, snow avalanches, environmental protection

Procedia PDF Downloads 92
7545 Reducing Change-Related Costs in Assembly of Lithium-Ion Batteries for Electric Cars by Mechanical Decoupling

Authors: Achim Kampker, Heiner Hans Heimes, Mathias Ordung, Nemanja Sarovic

Abstract:

A key component of the drive train of electric vehicles is the lithium-ion battery system. Among various other components, such as the battery management system or the thermal management system, the battery system mostly consists of several cells which are integrated mechanically as well as electrically. Due to different vehicle concepts with regard to space, energy and power specifications, there is a variety of different battery systems. The corresponding assembly lines are specially designed for each battery concept. Minor changes to certain characteristics of the battery have a disproportionately high effect on the set-up effort in the form of high change-related costs. This paper focuses on battery systems made of battery cells with a prismatic format. The product architecture and the assembly process are analyzed in detail based on battery concepts of existing electric cars, and key variety-causing drivers are identified. On this basis, several measures for changing the product architecture and the assembly process in order to reduce change-related costs are presented and discussed.

Keywords: assembly, automotive industry, battery system, battery concept

Procedia PDF Downloads 306
7544 Advancing Horizons: Standardized Future Trends in LiDAR and Remote Sensing Technologies

Authors: Spoorthi Sripad

Abstract:

Rapid advancements in LiDAR (Light Detection and Ranging) technology, coupled with the synergy of remote sensing, have revolutionized Earth observation methodologies. This paper delves into the transformative impact of integrated LiDAR and remote sensing systems. Focusing on miniaturization, cost reduction, and improved resolution, the study explores the evolving landscape of terrestrial and aquatic environmental monitoring. The integration of multi-wavelength and dual-mode LiDAR systems, alongside collaborative efforts with other remote sensing technologies, presents a comprehensive approach. The paper highlights the pivotal role of LiDAR in environmental assessment, urban planning, and infrastructure development. As the amalgamation of LiDAR and remote sensing reshapes Earth observation, this research anticipates a paradigm shift in our understanding of dynamic planetary processes.

Keywords: LiDAR, remote sensing, earth observation, advancements, integration, environmental monitoring, multi-wavelength, dual-mode, technology, urban planning, infrastructure, resolution, miniaturization

Procedia PDF Downloads 83
7543 Survey of Methods for Solutions of Spatial Covariance Structures and Their Limitations

Authors: Joseph Thomas Eghwerido, Julian I. Mbegbu

Abstract:

In modelling environmental processes, we apply multidisciplinary knowledge to explain, explore and predict the Earth's response to natural and human-induced environmental changes. In spatio-temporal ecological and environmental studies, the spatial parameters of interest are typically heterogeneous, which often negates the assumption of stationarity. Hence, describing the dispersion and transport of atmospheric pollutants, landscape or topographic effects, and weather patterns depends on a good estimate of the spatial covariance. The generalized linear mixed model, although linear in the expected-value parameters, has a likelihood that varies nonlinearly as a function of the covariance parameters. As a consequence, computing estimates for a linear mixed model requires the iterative solution of a system of simultaneous nonlinear equations. In order to predict the variables at unsampled locations, we need estimates at the sampled locations. Geostatistical methods for solving this spatial problem assume a stationary, locally defined covariance that is uniform in space; this assumption is often invalid because spatial processes frequently exhibit nonstationary, globally defined covariance. We consider different existing methods for solving the spatial covariance of space-time processes at unsampled locations, where the covariance changes with location across multiple time sets, and discuss their asymptotic properties.
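As a minimal sketch of the stationary (locally defined, uniform-in-space) covariance assumption discussed above, simple kriging with an exponential covariance model can be written in a few lines (the 1-D locations, sill, and range below are illustrative, not from the paper):

```python
import numpy as np

def exp_cov(h, sill=1.0, rng=10.0):
    """Stationary exponential covariance: depends only on the lag distance h."""
    return sill * np.exp(-np.abs(h) / rng)

# Hypothetical sampled locations and a prediction location.
xs = np.array([0.0, 4.0, 9.0])
x0 = 5.0

# Simple-kriging weights: solve C w = c0, where C holds covariances
# between sampled points and c0 the covariances to the target point.
C = exp_cov(np.abs(xs[:, None] - xs[None, :]))
c0 = exp_cov(np.abs(xs - x0))
w = np.linalg.solve(C, c0)
print(w.round(3))
```

A nonstationary model would let the sill and range vary with location, which is exactly where the stationary machinery above breaks down.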

Keywords: parametric, nonstationary, Kernel, Kriging

Procedia PDF Downloads 255
7542 Transdermal Delivery of Sodium Diclofenac from Palm Kernel Oil Esters Nanoemulsions

Authors: Malahat Rezaee, Mahiran Basri, Abu Bakar Salleh, Raja Noor Zaliha Raja Abdul Rahman

Abstract:

Sodium diclofenac is one of the most commonly used nonsteroidal anti-inflammatory drugs (NSAIDs). It is especially effective in controlling severe inflammation and pain, musculoskeletal disorders, arthritis, and dysmenorrhea. Formulation as a nanoemulsion is one of the nanoscience approaches that has been progressively considered in pharmaceutical science for transdermal drug delivery. Nanoemulsions are a type of emulsion with particle sizes ranging from 20 nm to 200 nm. An emulsion is formed by the dispersion of one liquid, usually the oil phase, in another immiscible liquid, the water phase, stabilized using a surfactant. Palm kernel oil esters (PKOEs), in comparison to other oils, contain higher amounts of shorter-chain esters, which are suitable for application in micro- and nanoemulsion systems as carriers for actives, with excellent wetting behavior and no oily feeling. This research aimed to study the effect of terpene type and concentration on sodium diclofenac permeation from palm kernel oil ester nanoemulsions, and the physicochemical properties of the nanoemulsion systems. The effects of the terpenes geraniol, menthone, menthol, cineol and nerolidol, at concentrations of 0.5, 1.0, 2.0, and 4.0%, on the permeation of sodium diclofenac were evaluated using Franz diffusion cells with rat skin as the permeation membrane. The results demonstrated that all terpenes promoted sodium diclofenac penetration; however, menthol and menthone showed significant effects (p < 0.05) on drug permeation at all concentrations. The most outstanding terpene was menthol, with the most significant effect on the skin permeability of sodium diclofenac. The effect of terpenes on the physicochemical properties of the nanoemulsion systems was investigated in terms of particle size, zeta potential, pH, viscosity and electrical conductivity. All terpenes had a significant effect on particle size and no significant effect on the zeta potential of the nanoemulsion systems. The effect of terpenes on pH was significant, except for menthone at concentrations of 0.5 and 1.0% and for cineol and nerolidol at 2.0%. Terpenes also had a significant effect on the viscosity of the nanoemulsions, except for menthone and cineol at 0.5%. Conductivity measurements showed that all terpenes at all concentrations, except cineol at 0.5%, had a significant effect on electrical conductivity.

Keywords: nanoemulsions, palm kernel oil esters, sodium diclofenac, terpenes, skin permeation

Procedia PDF Downloads 421
7541 Technical Assessment of Utilizing Electrical Variable Transmission Systems in Hybrid Electric Vehicles

Authors: Majid Vafaeipour, Mohamed El Baghdadi, Florian Verbelen, Peter Sergeant, Joeri Van Mierlo, Kurt Stockman, Omar Hegazy

Abstract:

The Electrical Variable Transmission (EVT), an electromechanical device, can be considered an alternative to the conventional transmission system utilized in Hybrid Electric Vehicles (HEVs). This study presents comparisons, in terms of fuel consumption, power split, and state of charge (SoC), between an HEV containing an EVT and conventional parallel and series topologies. To this end, simulations of these topologies are performed in the presence of control strategies enabling battery charge sustaining and efficient power split. The power flows through the components of the vehicle are obtained, and the fuel consumption results of the considered cases are compared. The investigation of the results indicates that utilizing an EVT can provide significant added value in HEV configurations. The outcome of the current research paves the way for applying design optimization approaches to such systems in further research.

Keywords: Electrical Variable Transmission (EVT), Hybrid Electric Vehicle (HEV), parallel, series, modeling

Procedia PDF Downloads 238
7540 Materialized View Effect on Query Performance

Authors: Yusuf Ziya Ayık, Ferhat Kahveci

Abstract:

Modern database management systems provide various tools, such as backup and maintenance, as well as statistical information on resource usage and security. In terms of query performance, this paper covers query optimization, views, indexed tables, pre-computed materialized views, and query performance analysis, in which alternative query plans are created and the least costly one is selected to optimize a query. Indexes and views can be created on related table columns. The literature review of this study showed that, despite the growing capabilities of database management systems, only database administrators are aware of the need to deal with archival and transactional data differently. Transactional data may be constantly changing data used in everyday life, while archival data may come, for example, from a completed questionnaire whose data input is finished. The database applies its capabilities to both types of data; but as shown in the findings section, instead of repeating heavy calculations that produce the same results for the same query over a set of survey results, a materialized view yields those results in a much simpler way. In this study, this performance difference was observed quantitatively by considering the cost of the query.
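The paper does not name its DBMS; as a small illustration of the precomputation idea, a materialized view can be emulated in SQLite (which has no native materialized views) by persisting an aggregate query's result once, so later report queries skip the heavy aggregation (the survey table and values are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical completed-survey table: these answers no longer change.
cur.execute("CREATE TABLE answers (question_id INTEGER, score INTEGER)")
cur.executemany("INSERT INTO answers VALUES (?, ?)",
                [(1, 5), (1, 3), (2, 4), (2, 2), (2, 3)])

# "Materialized view": persist the aggregate once, instead of
# re-running it for every subsequent report query.
cur.execute("""CREATE TABLE mv_avg_score AS
               SELECT question_id, AVG(score) AS avg_score
               FROM answers GROUP BY question_id""")

# Later queries read the precomputed rows directly.
rows = cur.execute(
    "SELECT question_id, avg_score FROM mv_avg_score ORDER BY question_id"
).fetchall()
print(rows)  # [(1, 4.0), (2, 3.0)]
```

In DBMSs with native support (e.g. a `CREATE MATERIALIZED VIEW` statement), the engine additionally manages refreshing the stored result; for archival data that never changes, a one-off refresh suffices.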

Keywords: cost of query, database management systems, materialized view, query performance

Procedia PDF Downloads 280
7539 Elimination of Organic Pollutants from Technical Landfill Leachate Using Fenton and Photo-Fenton Systems Combined with Biological Treatment

Authors: Belahmadi M. S. O., Abdessemed A., Benchiheub M., Doukali H., Kaid Kasbah K. M.

Abstract:

The aim of this study is to evaluate the quality of the leachate generated by the Batna landfill site, and to verify the performance of various advanced oxidation processes, in particular the Fenton and Photo-Fenton systems combined with biological treatment, for eliminating the recalcitrant organic matter contained in this effluent and preserving the reverse osmosis membranes used for leachate treatment. The average values obtained are compared with national and international discharge standards. The results of the physico-chemical analyses show that the leachate has an alkaline pH of 8.26 and a high organic load with a low oxygen content. Mineral pollution is indicated by high conductivity (38.3 mS/cm), a high Kjeldahl nitrogen content (1266.504 mg/L), and ammoniacal nitrogen (1098.384 mg/L). The average pollution-indicator parameters measured were BOD5 = 1483.333 mg O2/L, COD = 99790.244 mg O2/L, and TOC = 22400 mg C/L. These parameters exceed Algerian standards; hence, this effluent must be treated before being discharged into the environment. A comparative study was carried out to estimate the efficiency of two oxidation processes. Under optimum reaction conditions, TOC removal efficiencies of 63.43% and 73.4% were achieved for the Fenton and Photo-Fenton processes, respectively, and COD removal rates were estimated at 88% and 99.5%, respectively. In addition, the hybrid Photo-Fenton + bacteria + micro-algae treatment gave removal efficiencies of around 92.24% for TOC and 99.9% for COD, with -0.5 for AOS and 0.01 for CN. The results obtained during this study showed that a hybrid approach combining the Photo-Fenton process with biological treatment appears to be a highly effective alternative for achieving satisfactory treatment in terms of organic pollutant removal.
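The removal efficiencies quoted above follow the usual definition (C0 - C) / C0 x 100. A tiny sketch, using the reported initial TOC and a hypothetical final concentration consistent with the 73.4% Photo-Fenton figure (the final value is our back-calculation, not a measured number):

```python
def removal_efficiency(initial, final):
    """Percent removal of a pollutant: (C0 - C) / C0 * 100."""
    return (initial - final) / initial * 100.0

# Reported initial TOC of the leachate, in mg C/L.
toc_initial = 22400.0
# Hypothetical final TOC consistent with ~73.4% removal.
toc_final = toc_initial * (1 - 0.734)
print(round(removal_efficiency(toc_initial, toc_final), 1))  # 73.4
```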

Keywords: leachate, landfill, advanced oxidation processes, Fenton and Photo-Fenton systems, biological treatment, organic pollutants

Procedia PDF Downloads 67
7538 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals

Authors: Bahareh Ansari

Abstract:

Background: The Open Government Data (OGD) movement of the last decade has encouraged many government organizations around the world to make their data publicly available in order to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual’s cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of the methods empirically tested to enhance users’ performance and experience in working with a visualization tool. This list can be used to evaluate OGD visualization practices and inform future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of publications in top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results are complemented with a search in the references of the identified articles, and a search for the keywords 'open data visualization' and 'visualization evaluation' in the IEEE Xplore and ACM digital libraries. Articles are included if they provide empirical evidence through controlled user experiments, or review such empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods, and the conditions under which they are examined to positively affect the visualization outcomes.
Findings: The keyword search yields 760 studies, of which 30 are included after the title/abstract review. The classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on the graphs), and animation. Studies on decorative elements consistently show positive effects of these elements on user engagement and recall, but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in their findings of a positive effect on the outcomes. Storytelling studies show some inconsistencies regarding the design effect on user engagement, enjoyment, recall, and performance, which could be indicative of the specific conditions required for the use of this method. The last two methods, aesthetics and animation, appear less frequently in the included articles, and provide consistent positive results on some of the outcomes. Implications for e-government: The review of visualization best-practice methods shows that each of these methods is beneficial under specific conditions. By applying these methods under potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage with government data and ultimately take part in government policy-making procedures.

Keywords: best practices, data visualization, literature review, open government data

Procedia PDF Downloads 106
7537 myITLab as an Implementation Instance of Distance Education Technologies

Authors: Leila Goosen

Abstract:

The research problem reported on in this paper relates to improving success in Computer Science and Information Technology subjects where students are learning applications, especially when teaching occurs in a distance education context. An investigation was launched in order to address students’ struggles with applications, and improve their assessment in such subjects. Some of the main arguments presented centre on formulating and situating significant concepts within an appropriate conceptual framework. The paper explores the experiences and perceptions of computing instructors, teaching assistants, students and higher education institutions on how they are empowered by using technologies such as myITLab. They also share how they are working with the available features to successfully teach applications to their students. The data collection methodology used is then described. The paper includes discussions on how myITLab empowers instructors, teaching assistants, students and higher education institutions. Conclusions are presented on the way in which this paper could make an original and significant contribution to the promotion and development of knowledge in fields related to successfully teaching applications for student learning, including in a distance education context. The paper thus provides a forum for practitioners to highlight and discuss insights and successes, as well as identify new technical and organisational challenges, lessons and concerns regarding practical activities related to myITLab as an implementation instance of distance education technologies.

Keywords: distance, education, myITLab, technologies

Procedia PDF Downloads 359
7536 Domain-Specific Ontology-Based Knowledge Extraction Using R-GNN and Large Language Models

Authors: Andrey Khalov

Abstract:

The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches for extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNN) with large language models (LLMs). The proposed method leverages the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured texts. Furthermore, the paper explores how transfer learning can be applied to fine-tune large language models (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments.

Keywords: ontology mapping, R-GNN, knowledge extraction, large language models, NER, knowledge graph

Procedia PDF Downloads 16
7535 Intrinsically Dual-Doped Conductive Polymer System for Electromagnetic Shielding Applications

Authors: S. Koul, Joshua Adedamola

Abstract:

Electromagnetic pollution (EMP) is a growing global concern: it not only adversely affects human health but also causes malfunctioning of sensitive equipment, both locally and globally. The market offers many incumbent technologies to address the issue, but a processable, sustainable material solution within acceptable limits for GHG emissions is still at an exploratory stage. The present work offers a sustainable material solution with a wide range of processability in terms of the polymeric resin matrix, and shielding efficiency across the electromagnetic spectrum, covering both ionizing and non-ionizing electromagnetic radiation. It presents an in-situ synthesized conducting polyaniline (PANI) prepared in the presence of a hybrid dual-dopant system, with tuned conductivity and high shielding efficiency between 89 and 92 decibels, depending on the EMI frequency range. The conductive polymer, synthesized via the in-situ emulsion polymerization method, offers a surface resistance of 1.0 ohm/cm with thermal stability up to 245 °C in powder form. This conductive polymer with the hybrid dual-dopant system was used as a filler material with different thermoplastic resin systems to prepare conductive composites. Intrinsically conductive polymer (ICP) composites based on the hybrid dual-dopant system were prepared by melt blending, extrusion, and finally compression molding. The ICP composites offered good mechanical, thermal, structural, and weathering properties, and stable surface resistivity over time. Preliminary shielding measurements between 10 GHz and 24 GHz showed a shielding efficiency of more than 90 dB.

Keywords: ICP, dopant, EMI, shielding

Procedia PDF Downloads 81
7534 Hemocompatible Thin-Film Materials Recreating the Structure of the Cell Niches with High Potential for Endothelialization

Authors: Roman Major, Klaudia Trembecka-Wojciga, Juergen Markus Lackner, Boguslaw Major

Abstract:

The future development of science is seen in interdisciplinary areas such as biomedical engineering. Self-assembled structures, similar to stem cell niches, would inhibit the fast division process and subsequently capture stem cells from the blood flow. By means of surface topography, stiffness, and microstructure, progenitor cells should be differentiated towards the formation of an endothelial cell monolayer, which will effectively inhibit activation of the coagulation cascade. The idea of the material surface development has met the interest of clinical institutions, which support the development of science in this area and await scientific solutions that could contribute to the development of heart assist systems. This would improve the efficiency of treatment of patients with myocardial failure supported with artificial heart assist systems. Innovative materials would enable, in the post-project activity, the redesign of ventricular heart assist construction.

Keywords: bio-inspired materials, electron microscopy, haemocompatibility, niche-like structures, thin coatings

Procedia PDF Downloads 478
7533 Adaptive Kalman Filter for Fault Diagnosis of Linear Parameter-Varying Systems

Authors: Rajamani Doraiswami, Lahouari Cheded

Abstract:

Fault diagnosis of Linear Parameter-Varying (LPV) systems using an adaptive Kalman filter is proposed. The LPV model comprises scheduling parameters and emulator parameters. The scheduling parameters are chosen such that they are capable of tracking variations in the system model as a result of changes in the operating regimes. The emulator parameters, on the other hand, simulate variations in the subsystems during the identification phase and have negligible effect during the operational phase. The nominal model and the influence vectors, which are the gradients of the feature vector with respect to the emulator parameters, are identified off-line from a number of emulator-parameter-perturbed experiments. A Kalman filter is designed using the identified nominal model. As the system varies, the Kalman filter model is adapted using the scheduling variables. The residual is employed for fault diagnosis. The proposed scheme is successfully evaluated on a simulated system as well as on a physical process control system.
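The abstract does not spell out the filter equations; a minimal scalar Kalman filter with residual-based fault flagging, sketching the general idea rather than the authors' LPV implementation (all gains, noise variances, the fault scenario, and the threshold are illustrative), might look like this:

```python
def kalman_residuals(zs, a=1.0, c=1.0, q=1e-3, r=1e-2):
    """Scalar Kalman filter; returns normalized innovations for fault tests."""
    x, p = 0.0, 1.0
    out = []
    for z in zs:
        x, p = a * x, a * p * a + q                 # predict state and covariance
        s = c * p * c + r                           # innovation variance
        out.append((z - c * x) / s ** 0.5)          # normalized residual
        k = p * c / s                               # Kalman gain
        x, p = x + k * (z - c * x), (1 - k * c) * p # measurement update
    return out

# A constant-zero signal with an additive sensor fault of +2.0 from
# sample 100 onward; a healthy filter keeps residuals small, while the
# fault produces a large normalized innovation at its onset.
healthy = [0.0] * 200
faulty = [0.0] * 100 + [2.0] * 100
detect = lambda zs: max(abs(nu) for nu in kalman_residuals(zs)) > 5.0
print(detect(healthy), detect(faulty))  # False True
```

The adaptive step in the paper would, in addition, re-derive the filter matrices from the scheduling variables as the operating regime changes.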

Keywords: identification, linear parameter-varying systems, least-squares estimation, fault diagnosis, Kalman filter, emulators

Procedia PDF Downloads 499
7532 A Novel Software Model for Enhancement of System Performance and Security through an Optimal Placement of PMU and FACTS

Authors: R. Kiran, B. R. Lakshmikantha, R. V. Parimala

Abstract:

Secure operation of power systems requires monitoring of the system operating conditions. Phasor measurement units (PMUs) are devices that use synchronized signals from GPS satellites and provide phasor information of voltages and currents at a given substation. The optimal locations for the PMUs must be determined in order to avoid redundant use of PMUs. The objective of this paper is to make the system observable using a minimum number of PMUs, and to implement stability software at the 220 kV grid for on-line estimation of the power system transfer capability based on voltage and thermal limitations and for security monitoring. This software utilizes State Estimator (SE) and synchrophasor PMU data sets to determine the power system operational margin under normal and contingency conditions. It improves the security of the transmission system by continuously monitoring the operational margin, expressed in MW or in bus voltage angles, and alarms the operator if the margin violates a pre-defined threshold.
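The abstract does not state the placement algorithm; one common formulation treats minimum-PMU placement as a set-cover problem, since a PMU observes its own bus and all adjacent buses. A greedy sketch over a hypothetical 5-bus network (not the paper's grid) is:

```python
# Greedy sketch of the PMU placement idea: a PMU at a bus observes that
# bus and all adjacent buses; keep placing PMUs where they newly observe
# the most buses until every bus is observed.
adj = {
    1: {2, 3},
    2: {1, 3, 4},
    3: {1, 2, 5},
    4: {2},
    5: {3},
}

def greedy_pmu_placement(adj):
    unobserved = set(adj)
    placed = []
    while unobserved:
        # Pick the bus whose PMU would newly observe the most buses.
        bus = max(adj, key=lambda b: len(({b} | adj[b]) & unobserved))
        placed.append(bus)
        unobserved -= {bus} | adj[bus]
    return placed

print(sorted(greedy_pmu_placement(adj)))
```

Exact formulations solve the same coverage constraint as an integer linear program; the greedy heuristic merely illustrates the observability rule.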

Keywords: state estimator (SE), flexible ac transmission systems (FACTS), optimal location, phasor measurement units (PMU)

Procedia PDF Downloads 410
7531 Smart Lean Manufacturing in the Context of Industry 4.0: A Case Study

Authors: M. Ramadan, B. Salah

Abstract:

This paper introduces a framework to digitalize lean manufacturing tools in order to enhance smart lean-based manufacturing environments, or Lean 4.0 manufacturing systems. The paper discusses the integration of lean tools with the powerful features of recent real-time data capturing systems, with the help of Information and Communication Technologies (ICT), to develop an intelligent real-time system for monitoring and controlling production operations against lean targets. This integration is represented in the Lean 4.0 system called Dynamic Value Stream Mapping (DVSM). Moreover, the paper introduces the practice of Radio Frequency Identification (RFID) and ICT to smartly support lean tools and practices during daily production runs to keep the lean system alive and effective. This work provides a practical description of how the lean tools 5S, standardized work, and poka-yoke can be digitalized and smartly monitored and controlled through DVSM. A framework for the three tools has been discussed and put into practice at a German switchgear manufacturer.

Keywords: lean manufacturing, Industry 4.0, radio frequency identification, value stream mapping

Procedia PDF Downloads 229
7530 Linear Stability Analysis of a Regularized Two-Fluid Model for Unstable Gas-Liquid Flows in Long Hilly Terrain Pipelines

Authors: David Alejandro Lazo-Vasquez, Jorge Luis Balino

Abstract:

In the petroleum industry, multiphase flow occurs when oil, gas, and water are transported in the same pipe through large pipeline systems. The flow can take different patterns depending on parameters like fluid velocities, pipe diameter, pipe inclination, and fluid properties. Mainly, intermittent flow is produced by the natural propagation of short and long waves, according to Kelvin-Helmholtz stability theory. To model stratified flow and the onset of intermittent flow, it is crucial to understand the behavior of short and long waves. The two-fluid model, frequently employed for characterizing multiphase systems, becomes ill-posed for high liquid and gas velocities and large inclination angles, since short waves can then develop infinite growth rates. We focus attention on long-wave instability, which leads to the production of roll waves that may grow and result in the transition from stratified flow to intermittent flow. In this study, global and local linear stability analyses with dynamic and kinematic stability criteria predict the regions of stability of the flow for different pipe inclinations and fluid velocities, in both regularized and non-regularized systems. It was possible to distinguish when wave growth rates are absolutely bounded (stable stratified smooth flow), when waves have finite growth rates (unstable stratified wavy flow), and when the equation system becomes elliptic and hyperbolization is needed. In order to bound short-wave growth rates and regularize the equation system, we incorporated lower- and higher-order terms, namely interfacial drag and surface tension, respectively.
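A minimal numerical check of well-posedness in the spirit of the above: for a linearized first-order system A u_t + B u_x = 0, the model is hyperbolic (well-posed) when the characteristics, the eigenvalues of A^-1 B, are real, and becomes elliptic (ill-posed) when they turn complex. The matrices below are illustrative stand-ins, not the paper's two-fluid coefficients:

```python
import numpy as np

def characteristics(A, B):
    """Characteristic speeds of A u_t + B u_x = 0: eigenvalues of A^-1 B."""
    return np.linalg.eigvals(np.linalg.solve(A, B))

A = np.eye(2)
B_hyperbolic = np.array([[0.0, 1.0], [1.0, 0.0]])   # real eigenvalues +-1
B_elliptic = np.array([[0.0, -1.0], [1.0, 0.0]])    # imaginary eigenvalues +-i

# Real characteristics -> hyperbolic; complex -> elliptic, ill-posed.
is_well_posed = lambda A, B: np.all(np.isreal(characteristics(A, B)))
print(is_well_posed(A, B_hyperbolic), is_well_posed(A, B_elliptic))
```

Regularizing terms such as surface tension effectively push the eigenvalues of the linearized operator back onto the real axis within the wavelength range of interest.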

Keywords: linear stability analysis, multiphase flow, onset of slugging, two-fluid model regularization

Procedia PDF Downloads 135
7529 Modelling of Reactive Methodologies in Auto-Scaling Time-Sensitive Services With a MAPE-K Architecture

Authors: Óscar Muñoz Garrigós, José Manuel Bernabeu Aubán

Abstract:

Time-sensitive services are the base of the cloud services industry. Keeping service saturation low is essential for controlling response time. All auto-scalable services make use of reactive auto-scaling; however, few in-depth studies of reactive auto-scaling exist. This presentation shows a model for reactive auto-scaling methodologies with a MAPE-K architecture. Queuing theory can compute different properties of static services but lacks some of the parameters related to the transition between models; our model uses queuing theory parameters to describe that transition. It associates the MAPE-K-related times, the sampling frequency, the cooldown period, the number of requests an instance can handle per unit of time, the number of incoming requests at a time instant, and a function describing the acceleration in the service's ability to handle more requests. This model is later used as a solution to horizontally auto-scale time-sensitive services composed of microservices, periodically reevaluating the model's parameters to allocate resources. The solution requires limiting the acceleration of the growth in the number of incoming requests to keep response time constrained; business benefits determine such limits. The solution can add a dynamic number of instances and remains valid for different system sizes. The study includes performance recommendations to improve results according to the shape of the incoming load and business benefits. The exposed methodology is tested in a simulation. The simulator contains a load generator and a service composed of two microservices, where the frontend microservice depends on a backend microservice with a 1:1 request relation ratio. A common request takes 2.3 seconds to be computed by the service and is discarded if it takes more than 7 seconds.
Both microservices contain a load balancer that assigns requests to the least loaded instance and preemptively discards requests that cannot finish in time, to prevent resource saturation. When load decreases, instances with lower load are kept in a backlog where no more requests are assigned to them. If the load grows and an instance in the backlog is required, it returns to the running state; if it instead finishes the computation of all its requests and is no longer required, it is permanently deallocated. A few load patterns are required to represent the worst-case scenario for reactive systems; the following scenarios test response times, resource consumption, and business costs. The first scenario is a burst-load scenario. All methodologies will discard requests if the burst is rapid enough; this scenario focuses on the number of discarded requests and the variance of the response time. The second scenario contains sudden load drops followed by bursts, to observe how the methodology behaves when releasing resources that are later required. The third scenario contains diverse growth accelerations in the number of incoming requests, to observe how approaches that add different numbers of instances can handle the load at less business cost. The exposed methodology is compared against a multiple-threshold CPU methodology allocating/deallocating 10 or 20 instances, outperforming the competitor in all studied metrics.
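The reactive loop the abstract describes can be sketched as a minimal MAPE-K-style controller. The sizing rule, the headroom factor, and all parameter values below are illustrative assumptions, not the authors' model; the sketch only shows how a sampling period, a per-instance capacity, and a cooldown interact.

```python
# Minimal sketch of a reactive auto-scaler in the spirit of a MAPE-K loop.
# The sizing rule, headroom factor, and parameter values are illustrative
# assumptions, not the methodology of the paper.
import math

def plan_instances(incoming_rps, capacity_rps, headroom=0.8):
    """Instances needed so that each runs below `headroom` of its capacity."""
    return max(1, math.ceil(incoming_rps / (capacity_rps * headroom)))

class ReactiveScaler:
    def __init__(self, capacity_rps, cooldown_steps=3):
        self.capacity_rps = capacity_rps      # requests/s one instance handles
        self.cooldown_steps = cooldown_steps  # sampling periods to delay scale-down
        self.instances = 1
        self._cooldown = 0

    def step(self, incoming_rps):
        """One Monitor-Analyze-Plan-Execute iteration per sampling period."""
        target = plan_instances(incoming_rps, self.capacity_rps)
        if target > self.instances:       # scale up immediately
            self.instances = target
            self._cooldown = self.cooldown_steps
        elif target < self.instances:     # scale down only after the cooldown
            if self._cooldown == 0:
                self.instances = target
            else:
                self._cooldown -= 1
        return self.instances

scaler = ReactiveScaler(capacity_rps=100)
history = [scaler.step(load) for load in [50, 250, 250, 80, 80, 80, 80]]
```

Scaling up immediately while delaying scale-down by a cooldown is a common way to avoid releasing instances that a subsequent burst would require again, the failure mode probed by the second test scenario above.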

Keywords: reactive auto-scaling, auto-scaling, microservices, cloud computing

Procedia PDF Downloads 93
7528 Experiences of Timing Analysis of Parallel Embedded Software

Authors: Muhammad Waqar Aziz, Syed Abdul Baqi Shah

Abstract:

The execution time analysis is fundamental to the successful design and execution of real-time embedded software. In such analysis, the Worst-Case Execution Time (WCET) of a program is a key measure, on the basis of which system tasks are scheduled. The WCET analysis of embedded software is also needed for system understanding and to guarantee its behavior. WCET analysis can be performed statically (without executing the program) or dynamically (through measurement). Traditionally, research on WCET analysis has assumed sequential code running on single-core platforms. However, as computation steadily moves towards combinations of parallel programs and multi-core hardware, new challenges in WCET analysis need to be addressed. In this article, we report our experiences of performing WCET analysis of Parallel Embedded Software (PES) running on a multi-core platform. The primary purpose was to investigate how WCET estimates of PES can be computed statically, and how they can be derived dynamically. Our experiences, as reported in this article, include the challenges we faced, suggestions for addressing these challenges, and the workarounds that were developed. The article also provides observations on the benefits and drawbacks of deriving WCET estimates using the said methods and gives useful recommendations for further research in this area.
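As a point of reference for the dynamic approach mentioned above, a measurement-based WCET estimate is obtained by timing many runs of the task under varied inputs and taking the maximum observed time, usually inflated by a safety margin, since a measured maximum only bounds the true WCET from below. The task, input generator, and margin in this sketch are illustrative assumptions, not the article's setup:

```python
# Sketch of measurement-based (dynamic) WCET estimation: time many runs of
# the task under varied inputs and keep the maximum observed execution time,
# inflated by a safety margin because a measured maximum only bounds the
# true WCET from below. Task, inputs, and margin are illustrative.
import random
import time

def task(data):
    # Stand-in for the embedded task under analysis.
    return sorted(data)

def measured_wcet(runs=1000, margin=1.2):
    worst = 0.0
    for _ in range(runs):
        data = [random.random() for _ in range(256)]  # varied test inputs
        start = time.perf_counter()
        task(data)
        worst = max(worst, time.perf_counter() - start)
    return worst * margin

estimate = measured_wcet(runs=100)
```

Static analysis avoids this under-approximation risk by bounding all feasible paths, at the cost of pessimism, which is exactly the trade-off the article's experiences compare.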

Keywords: embedded software, worst-case execution-time analysis, static flow analysis, measurement-based analysis, parallel computing

Procedia PDF Downloads 324
7527 Impact of Task Technology Fit on User Effectiveness, Efficiency and Creativity in Iranian Pharmaceutical Organizations

Authors: Milad Keshvardoost, Amir Khanlari, Nader Khalesi

Abstract:

Background: Any firm in the pharmaceutical industry requires efficient and effective management information systems (MIS) to support managerial functions. Purpose: The aim of this study is to investigate the impact of Task-Technology Fit (TTF) on user effectiveness, efficiency, and creativity in Iranian pharmaceutical companies. Methodology: 345 reliable and valid questionnaires, based on a Likert scale, were distributed through the cluster sampling method to information system users of eight leading Iranian pharmaceutical companies. The proposed model relates task-technology fit to user performance, defined as efficiency, effectiveness, and creativity, through the mediating effects of perceived usefulness and perceived ease of use. Results: This study confirmed that TTF, defined in terms of adequacy and compatibility, has a positive impact on user performance. Conclusion: We conclude that pharmaceutical organizations should observe IS users' demands precisely and closely in order to design an IS framework tailored to those users.

Keywords: information systems, user performance, pharmaceuticals, task technology fit

Procedia PDF Downloads 171
7526 Fully Autonomous Vertical Farm to Increase Crop Production

Authors: Simone Cinquemani, Lorenzo Mantovani, Aleksander Dabek

Abstract:

New technologies in agriculture are opening new challenges and new opportunities. Among these, robotics, vision, and artificial intelligence are certainly the ones that will make possible a significant leap compared to traditional agricultural techniques. The indoor farming sector, in particular, will benefit the most from these solutions. Vertical farming is a new field of research where mechanical engineering can bring knowledge and know-how to transform a highly labor-based business into a fully autonomous system. The aim of the research is to develop a multi-purpose, modular, and perfectly integrated platform for crop production in indoor vertical farming. Activities will be based both on hardware development, such as automatic tools to perform different activities on soil and plants, and on research introducing the extensive use of monitoring techniques based on machine learning algorithms. This paper presents the preliminary results of a research project on a vertical farm living lab designed to (i) develop and test vertical farming cultivation practices, (ii) introduce a very high degree of mechanization and automation that makes all processes replicable, fully measurable, standardized, and automated, (iii) develop a coordinated control and management environment for autonomous multiplatform or tele-operated robots, with the aim of carrying out complex tasks in the presence of environmental and cultivation constraints, and (iv) integrate AI-based algorithms as a decision support system to improve production quality. The coordinated management of multiplatform systems still presents innumerable challenges that require a strongly multidisciplinary approach from the design, development, and implementation phases onwards.
The methodology is based on (i) the development of models capable of describing the dynamics of the various platforms and their interactions, (ii) the integrated design of mechatronic systems able to respond to the needs of the context and to exploit the strengths highlighted by the models, and (iii) implementation and experimental tests performed to assess the real effectiveness of the systems created and to evaluate any weaknesses, so as to proceed with targeted development. To these ends, a fully automated laboratory for growing plants in vertical farming has been developed and tested. The living lab makes extensive use of sensors to determine the overall state of the structure, crops, and systems used. The availability of specific measurements for each element involved in the cultivation process makes it possible to evaluate the effects of each variable of interest and allows the creation of a robust model of the system as a whole. The automation of the laboratory is completed by the use of robots to carry out all the necessary operations, from sowing to handling to harvesting. These systems work synergistically thanks to detailed models developed from the information collected, which deepen the knowledge of these types of crops and guarantee the possibility of tracing every action performed on each single plant. To this end, artificial intelligence algorithms have been developed to allow the synergistic operation of all systems.

Keywords: automation, vertical farming, robot, artificial intelligence, vision, control

Procedia PDF Downloads 39
7525 The Role of Knowledge Management in Global Software Engineering

Authors: Samina Khalid, Tehmina Khalil, Smeea Arshad

Abstract:

Knowledge management is an essential ingredient of successful coordination in globally distributed software engineering. Various frameworks, knowledge management systems (KMSs), and tools have been proposed to foster coordination and communication between virtual teams, but practical implementations of these solutions are rarely found, as organizations face challenges in implementing a knowledge management system. For this purpose, a literature review was first conducted to investigate the challenges that prevent organizations from implementing a KMS; taking these challenges into account, the need for an integrated solution, in the form of a standardized KMS that can easily store tacit and explicit knowledge, was traced in order to facilitate coordination and collaboration among virtual teams. The literature review has already shown that knowledge is a complex perception with profound meanings and one of the most important resources contributing to an organization's competitive advantage. In order to meet the various challenges caused by improperly managed project knowledge among virtual teams in GSE, we suggest making use of the cloud computing model. In this research, a distributed architecture to support KM storage, called a conceptual framework of KM as a service in the cloud, is proposed. The presented framework is enhanced, and the conceptual framework of KM is embedded into it to store project-related knowledge for future use.

Keywords: knowledge management, global software development, global software engineering

Procedia PDF Downloads 527
7524 Design of Electric Ship Charging Station Considering Renewable Energy and Storage Systems

Authors: Jun Yuan

Abstract:

Shipping is a major transportation mode worldwide, and it contributes significantly to global carbon emissions. Electrification of ships is one of the main strategies for reducing shipping carbon emissions, and the number of electric ships has continued to grow in recent years. However, charging infrastructure is still scarce, which severely restricts the development of electric ships. It is therefore very important to design ship charging stations properly by comprehensively considering charging demand and investment costs. This study aims to minimize the full life cycle cost of a charging station while considering the uncertainty of charging demand. A mixed integer programming model is developed for this optimization problem. Based on the characteristics of the mathematical model, a simulation-based optimization method is proposed to find the optimal number and rated power of chargers. In addition, the impact of renewable energy and storage systems is analyzed. The results can provide decision support and a reference basis for the design of ship charging stations.
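The simulation-based optimization the abstract proposes can be illustrated with a deliberately simplified sketch: enumerate candidate designs (number of chargers, rated power), estimate each design's expected penalty for unserved demand by sampling uncertain daily demand, and keep the design with the lowest total life-cycle cost. All cost figures, the demand distribution, and the usable-hours assumption are invented for illustration and are not from the paper.

```python
# Simplified sketch of simulation-based sizing for a ship charging station:
# enumerate designs (number of chargers, rated power), estimate the expected
# penalty for unserved demand from sampled daily demand, and keep the design
# with the lowest life-cycle cost. All figures are invented for illustration.
import random

random.seed(42)

CHARGER_COST_PER_KW = 150.0       # capex per kW of rated power (assumed)
FIXED_COST_PER_CHARGER = 20000.0  # installation cost per charger (assumed)
UNSERVED_PENALTY = 5.0            # cost per kWh of unmet demand (assumed)
USABLE_HOURS = 12                 # charging hours available per day (assumed)

def sample_daily_demand_kwh():
    """Uncertain daily charging demand, assumed roughly normal."""
    return max(0.0, random.gauss(8000.0, 2000.0))

def lifecycle_cost(n_chargers, power_kw, days=365 * 10, scenarios=200):
    capex = n_chargers * (FIXED_COST_PER_CHARGER + CHARGER_COST_PER_KW * power_kw)
    daily_capacity = n_chargers * power_kw * USABLE_HOURS
    shortfall = sum(max(0.0, sample_daily_demand_kwh() - daily_capacity)
                    for _ in range(scenarios)) / scenarios
    return capex + shortfall * UNSERVED_PENALTY * days

def optimal_design(charger_counts, power_levels):
    return min(((n, p) for n in charger_counts for p in power_levels),
               key=lambda d: lifecycle_cost(*d))

best = optimal_design(range(1, 6), [50, 100, 200, 400])
```

A mixed integer programming formulation would replace the exhaustive enumeration with a solver, but the cost structure (capex plus expected shortfall penalty over sampled demand scenarios) stays the same.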

Keywords: shipping emissions, electric ship, charging station, optimal design

Procedia PDF Downloads 62
7523 Integrated Decision Support for Energy/Water Planning in Zayandeh Rud River Basin in Iran

Authors: Safieh Javadinejad

Abstract:

In order to make well-informed decisions about long-term system planning, resource managers and policymakers need to understand the interconnections between energy and water use and production, i.e., the energy-water nexus. Planning and assessment issues include the development of strategies for reducing the water and energy system's vulnerability to climate change while also decreasing greenhouse gas emissions. To deliver useful decision support for climate adaptation policy and planning, it is essential to understand the regionally specific features of the energy-water nexus and the history and future of the water and energy supply systems serving the region. This will help decision makers understand current water-energy system conditions and the capacity for future adaptation plans. This research presents an integrated hydrology/energy modeling platform that can extend water-energy analyses based on a detailed representation of local circumstances. The platform links the Water Evaluation and Planning (WEAP) system and the Long-range Energy Alternatives Planning (LEAP) system to create a full picture of water-energy processes. This will allow water managers and policymakers to easily understand the links between energy system improvements and hydrological processes and to see how future climate change will affect water-energy systems. The Zayandeh Rud river basin in Iran is selected as a case study to demonstrate the results and application of the analysis. The region is known for the large degree of integration between its electric power and water sectors. The linkages between water, energy, and climate change, together with possible adaptation strategies, are described along with early insights from applications of the integrated modeling system.

Keywords: climate impacts, hydrology, water systems, adaptation planning, electricity, integrated modeling

Procedia PDF Downloads 292
7522 Are SMS Reminders a Precursor to Outpatient Show-Ups?

Authors: Shankar M. Bakkannavar, Smitha Nayak, Vinod C. Nayak, Ravi Bagali

Abstract:

Attendance rates for hospital outpatient appointments play a pivotal role in the operational efficiency of a hospital. Strategic interventions such as reminder systems prior to the scheduled appointment have proved to be an effective strategy for outpatient appointment show-ups. This study was designed with the objective of assessing the effectiveness of SMS reminders as an intervention to enhance hospital outpatient attendance. Method: The survey was conducted at Columbia Asia Hospital, Bangalore. We surveyed 60 patients who had a scheduled outpatient appointment in the Department of General Medicine, the Department of Obstetrics and Gynecology, or the Orthopedics Department, as these departments had a heavy patient flow and contributed most to the top line of the hospital. Results: The majority (64%) of patients preferred to be sent an SMS reminder of the outpatient appointment schedule. 37 (61%) respondents stated that, ideally, reminders could be effective only if sent 24-48 hours prior to the appointment schedule. 41 (68%) respondents were of the opinion that a minimum of two reminders would be necessary to ensure that patients show up for the appointment. These associations were significant at the 1% level; in particular, a strong association was observed between age and the preferred mode of reminder (p = 0.002).

Keywords: reminder systems, appointment show-ups, SMS reminders, health information

Procedia PDF Downloads 354
7521 Cloud Support for Scientific Workflow Execution: Prototyping Solutions for Remote Sensing Applications

Authors: Sofiane Bendoukha, Daniel Moldt, Hayat Bendoukha

Abstract:

Workflow concepts are essential for the development of remote sensing applications. They can help users manage and process satellite data and execute scientific experiments on distributed resources. The objective of this paper is to introduce an approach for the specification and execution of complex scientific workflows in Cloud-like environments. The approach strives to support scientists during the modeling, deployment, and monitoring of their workflows. This work takes advantage of Petri nets and, more specifically, the so-called reference nets formalism, which provides a robust modeling/implementation technique. RENEWGRASS is a tool that we implemented and integrated into the Petri nets editor and simulator RENEW. It provides an easy way to support inexperienced scientists during the specification of their workflows and allows both the modeling and enactment of image processing workflows from the remote sensing domain. Our case study concerns the implementation of vegetation indices; we have implemented the Normalized Difference Vegetation Index (NDVI) workflow. Additionally, we explore the integration possibilities of Cloud technology as a supplementary layer for the deployment of the current implementation. For this purpose, we discuss migration patterns for data and applications and propose an architecture.
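The NDVI workflow mentioned as the case study reduces, per pixel, to the band arithmetic NDVI = (NIR - Red) / (NIR + Red). A minimal sketch with NumPy arrays standing in for the red and near-infrared satellite bands (RENEWGRASS itself drives GRASS GIS modules, which are not reproduced here):

```python
# Per-pixel NDVI = (NIR - Red) / (NIR + Red), the band arithmetic at the
# heart of the NDVI workflow. The small arrays stand in for satellite bands.
import numpy as np

def ndvi(nir, red, eps=1e-12):
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    # eps guards against division by zero on pixels where both bands are 0.
    return (nir - red) / (nir + red + eps)

nir_band = np.array([[0.6, 0.5], [0.8, 0.1]])
red_band = np.array([[0.2, 0.5], [0.1, 0.1]])
result = ndvi(nir_band, red_band)
# Values lie in [-1, 1]; dense, healthy vegetation approaches +1.
```

In a workflow net, this computation would be a single transition between "bands acquired" and "index computed" places, which is what makes the reference nets formalism a natural fit for such pipelines.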

Keywords: cloud computing, scientific workflows, Petri nets, RENEWGRASS

Procedia PDF Downloads 447
7520 Photon Blockade in Non-Hermitian Optomechanical Systems with Nonreciprocal Couplings

Authors: J. Y. Sun, H. Z. Shen

Abstract:

We study the photon blockade at exceptional points for a non-Hermitian optomechanical system coupled to a driven whispering-gallery-mode microresonator with two nanoparticles under the weak optomechanical coupling approximation, where exceptional points emerge periodically as the relative angle of the nanoparticles is varied. We find that conventional photon blockade occurs at exceptional points for the eigenenergy resonance of the single-excitation subspace driven by a laser field, and we discuss the physical origin of conventional photon blockade. Under the weak driving condition, we analyze the influence of the different parameters on conventional photon blockade. We investigate conventional photon blockade at nonexceptional points, where it exists at two optimal detunings because the eigenstates of the single-excitation subspace split from one (coalescence) at exceptional points into two at nonexceptional points. Unconventional photon blockade can occur at nonexceptional points, while it does not exist at exceptional points, since the two different quantum pathways to the two-photon state are not formed and the destructive quantum interference therefore cannot occur. The realization of photon blockade in our proposal provides a viable and flexible way to prepare single-photon sources in non-Hermitian optomechanical systems.
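The eigenvalue coalescence that defines an exceptional point can be illustrated numerically with a generic two-level non-Hermitian Hamiltonian; this toy gain-loss matrix illustrates EP physics in general and is not the specific optomechanical model of the paper:

```python
# Toy two-level non-Hermitian Hamiltonian H = [[i*g, k], [k, -i*g]] with
# balanced gain/loss g and coupling k. Its eigenvalues ±sqrt(k**2 - g**2)
# are real and distinct for k > g, coalesce at the exceptional point k = g,
# and become imaginary for k < g. Illustrative only; not the paper's model.
import numpy as np

def eigenvalues(g, k):
    H = np.array([[1j * g, k], [k, -1j * g]])
    return np.linalg.eigvals(H)

below_ep = eigenvalues(g=1.0, k=2.0)  # two distinct real eigenvalues ±sqrt(3)
at_ep = eigenvalues(g=1.0, k=1.0)     # both eigenvalues coalesce at 0
```

This coalescence of the single-excitation eigenstates into one is what removes the second optimal detuning at the exceptional point in the abstract's argument.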

Keywords: optomechanical systems, photon blockade, non-Hermitian, exceptional points

Procedia PDF Downloads 140
7519 Axiomatic Design and Organization Design: Opportunities and Challenges in Transferring Axiomatic Design to the Social Sciences

Authors: Nicolay Worren, Christopher A. Brown

Abstract:

Axiomatic design (AD) has mainly been applied to support the design of physical products and software solutions. However, it was intended as a general design approach that would also be applicable to the design of social systems, including organizations (i.e., organization design). In this article, we consider how AD may be successfully transferred to the field of organizational design. On the one hand, it provides a much-needed pragmatic approach that can help leaders clarify the link between the purpose and structure of their organizations, identify ineffective organizational structures, and increase the chance of achieving strategic goals. On the other hand, there are four conceptual challenges that may create uncertainty and resistance among scholars and practitioners educated in the social sciences: 1) The exclusive focus in AD on negative interdependencies ('coupling'); 2) No obvious way of representing the need for integration across design parameters (DPs); 3) A lack of principles for handling control processes that seem to require 'deliberate coupling' of FRs; and 4) A lack of principles for handling situations where conflicting FRs (i.e., coupling) might require integration rather than separation. We discuss alternative options for handling these challenges so that scholars and practitioners can make use of AD for organization design.

Keywords: axiomatic design, organization design, social systems, concept definitions

Procedia PDF Downloads 126
7518 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute a Likelihood Ratio (LR) to quantify the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, wherein a set of assumptions and methods for a given data set is made. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluates the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented so as not to obtain an incorrect estimate that would be used to deliver a wrong judgment in the court of law. The estimation of the LR is fundamentally a Bayesian concept, and we used two LR estimators, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE), in this paper. The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Though the LoR performed better than the KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
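The kernel-density route to the LR can be sketched numerically: estimate the density of comparison scores under the prosecution (same-writer) and defense (different-writer) hypotheses, then evaluate their ratio at the observed score. The Gaussian score distributions and the hand-rolled KDE below are illustrative assumptions, not the paper's data or estimator:

```python
# Sketch of a KDE-based likelihood ratio LR = p(score | Hp) / p(score | Hd),
# where Hp/Hd are the prosecution (same writer) and defense (different
# writer) hypotheses. Scores are synthetic; a real system would use
# comparison scores computed from handwriting features.
import numpy as np

rng = np.random.default_rng(0)
same_writer = rng.normal(0.8, 0.10, 500)   # scores under Hp (assumed)
diff_writer = rng.normal(0.3, 0.15, 500)   # scores under Hd (assumed)

def kde(samples, x, bandwidth=0.05):
    """Gaussian kernel density estimate of the sample density at x."""
    z = (x - samples) / bandwidth
    return np.exp(-0.5 * z ** 2).mean() / (bandwidth * np.sqrt(2 * np.pi))

def likelihood_ratio(score):
    return kde(same_writer, score) / kde(diff_writer, score)

lr_high = likelihood_ratio(0.80)  # score typical of same-writer pairs
lr_low = likelihood_ratio(0.30)   # score typical of different-writer pairs
```

Resampling the reference scores (e.g., by bootstrapping) and recomputing the LR at the same point is one way to obtain the confidence intervals the paper reports, since KDE tail estimates are sensitive to the sample drawn.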

Keywords: confidence interval, handwriting, kernel density estimator (KDE), logistic regression (LoR), repeatability, reproducibility

Procedia PDF Downloads 124
7517 Context-Aware Recommender Systems Using User's Emotional State

Authors: Hoyeon Park, Kyoung-jae Kim

Abstract:

Product recommendation is a field of research that has received much attention amid the recent information overload phenomenon. The proliferation of the mobile environment and social media inevitably affects the results of recommendation, depending on how the factors of the user's situation are reflected in the recommendation process. Recently, research attention has spread to the context-aware recommender system, which reflects the user's contextual information in the recommendation process. Until now, however, most context-aware recommender system research has been limited in that it reflects only the passive context of users. Users can be expected to express their contextual information through active behavior, which increases the importance of context-aware recommender systems that reflect this information. The purpose of this study is to propose a context-aware recommender system that can reflect the user's emotional state as active context information in the recommendation process. A context-aware recommender system can make more sophisticated recommendations by utilizing the user's contextual information and has the advantage that the user's emotional factors can be considered, in contrast to existing recommender systems. In this study, we propose a method to infer the user's emotional state, one element of the user's context information, from the user's facial expression data, and to reflect it in the recommendation process. The study collects the facial expression data of a user who is looking at a specific product along with the user's product preference score. We then classify the facial expression data into several categories, following previous research, and construct a model that can predict them. Next, the predicted results are applied to existing collaborative filtering with contextual information.
As a result of the study, the context-aware recommender system incorporating facial expression information showed improved recommendation performance. Based on these results, future research is expected on recommender systems reflecting various kinds of contextual information.
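The contextual filtering step described above can be sketched in a deliberately simplified form: ratings are tagged with the emotion inferred from facial expressions, and a prediction for a user in a given emotional state uses only ratings observed under that same state (a plain mean over the contextual neighborhood rather than similarity-weighted collaborative filtering). Data and labels are synthetic illustrations:

```python
# Simplified contextual pre-filtering for a context-aware recommender:
# each rating is tagged with the emotion inferred from the user's facial
# expression, and predictions use only ratings given under the same emotion.
# The neighborhood step is a plain mean, not similarity-weighted CF, and
# all data below are synthetic illustrations.

# (user, item, rating, emotion) tuples; emotion inferred from expressions.
ratings = [
    (0, 0, 5.0, "happy"), (0, 1, 2.0, "sad"),
    (1, 0, 4.0, "happy"), (1, 2, 5.0, "happy"),
    (2, 0, 4.5, "happy"), (2, 1, 1.0, "sad"), (2, 2, 4.0, "happy"),
]

def predict(user, item, emotion):
    """Mean rating of `item` by other users observed in the same emotion."""
    relevant = [r for (u, i, r, e) in ratings
                if i == item and e == emotion and u != user]
    return sum(relevant) / len(relevant) if relevant else None

# Predict how user 1, currently "sad", would rate item 1.
pred = predict(user=1, item=1, emotion="sad")
```

In a full system, the emotion label would come from the facial expression classifier, and the mean would be replaced by similarity-weighted collaborative filtering over the pre-filtered ratings.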

Keywords: context-aware, emotional state, recommender systems, business analytics

Procedia PDF Downloads 229