Search results for: rule based systems
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 34113

31563 Simplifying the Migration of Architectures in Embedded Applications Introducing a Pattern Language to Support the Workforce

Authors: Farha Lakhani, Michael J. Pont

Abstract:

There are two main architectures used to develop software for modern embedded systems: these can be labelled as “event-triggered” (ET) and “time-triggered” (TT). The research presented in this paper is concerned with the issues involved in migration between these two architectures. Although TT architectures are widely used in safety-critical applications, they are less familiar to developers of mainstream embedded systems. The research presented in this paper began from the premise that, for a broad class of systems that have been implemented using an ET architecture, migration to a TT architecture would improve reliability. It may be tempting to assume that conversion between ET and TT designs will simply involve converting all event-handling software routines into periodic activities. However, the required changes to the software architecture are, in many cases, rather more profound. The main contribution of the work presented in this paper is to identify ways in which the significant effort involved in migrating between existing ET architectures and “equivalent” (and effective) TT architectures could be reduced. The research described in this paper takes an innovative step in this regard by introducing the use of design patterns for this purpose for the first time.

Keywords: embedded applications, software architectures, reliability, pattern

Procedia PDF Downloads 330
31562 Measurement of Viscosity and Moisture of Oil in Supradistribution Transformers Using Ultrasonic Waves

Authors: Ehsan Kadkhodaie, Shahin Parvar, Soroush Senemar, Mostafa Shriat, Abdolrasoul Malekpour

Abstract:

The role of oil in supra-distribution transformers is critical, and several standards for determining the quality of the oil have been proposed. To date, the moisture, viscosity and insulation strength of the oil have been measured using mechanical and chemical methods and systems such as Karl Fischer titration, the falling-ball method and the TDM 4000; most of these techniques are destructive and suffer from problems such as contamination. In this study, drawing on the properties of oil and the physical behavior of ultrasound waves, a new method was designed for determining oil indicators, including viscosity and moisture. The results show that the oil viscosity can be found from the relationship μ = 42.086/√EE and the moisture from the relationship (PLUS+) = −15.65 (PPM) + 26040.

Keywords: oil, viscosity, moisture, ultrasonic waves

Procedia PDF Downloads 585
31561 Visibility Measurements Using a Novel Open-Path Optical Extinction Analyzer

Authors: Nabil Saad, David Morgan, Manish Gupta

Abstract:

Visibility has become a key component of air quality and is regulated in many areas by environmental laws such as the EPA Clean Air Act and the Regional Haze Rule. Typically, visibility is calculated by estimating the optical absorption and scattering of both gases and aerosols. A major component of aerosols’ climatic effect is due to their scattering and absorption of solar radiation, which are governed by their optical and physical properties. However, the accurate assessment of this effect on global warming, climate change, and air quality is made difficult by uncertainties in the calculation of single scattering albedo (SSA). Experimental complications arise in the determination of the single scattering albedo of an aerosol particle, since it requires the simultaneous measurement of both scattering and extinction. In fact, aerosol optical absorption in particular is a difficult measurement to perform, and it is often associated with large uncertainties when using filter methods or difference methods. In this presentation, we demonstrate the use of a new open-path Optical Extinction Analyzer (OEA) in conjunction with a nephelometer and two particle sizers, emphasizing the benefits that co-employment of the OEA offers in deriving the complex refractive index of aerosols and their single scattering albedo. Various use cases, data reproducibility, and instrument calibration will also be presented to highlight the value proposition of this novel open-path OEA.
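The quantities named above can be illustrated with a short sketch: single scattering albedo is the ratio of scattering to total extinction, and visual range can be estimated from extinction via the standard Koschmieder relation. The coefficient values below are hypothetical, not measurements from the OEA.

```python
# Illustrative sketch (not from the abstract): single scattering albedo (SSA)
# and Koschmieder visual range from scattering/absorption coefficients.
# The coefficient values below are hypothetical.

def single_scattering_albedo(b_scat, b_abs):
    """SSA = scattering / extinction, with extinction = scattering + absorption."""
    b_ext = b_scat + b_abs
    return b_scat / b_ext, b_ext

def koschmieder_visibility(b_ext):
    """Visual range (same length unit as 1/b_ext) for a 2% contrast threshold."""
    return 3.912 / b_ext

ssa, b_ext = single_scattering_albedo(b_scat=0.8e-4, b_abs=0.2e-4)  # units: 1/m
print(round(ssa, 2))                          # 0.8
print(round(koschmieder_visibility(b_ext)))   # 39120 (meters here)
```

A strongly absorbing aerosol lowers the SSA and, for the same extinction, contributes more to warming; this is why the simultaneous measurement of scattering and extinction matters.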

Keywords: aerosols, extinction, visibility, albedo

Procedia PDF Downloads 96
31560 Reactive Power Control Strategy for Z-Source Inverter Based Reconfigurable Photovoltaic Microgrid Architectures

Authors: Reshan Perera, Sarith Munasinghe, Himali Lakshika, Yasith Perera, Hasitha Walakadawattage, Udayanga Hemapala

Abstract:

This research presents a reconfigurable architecture for residential microgrid systems utilizing a Z-Source Inverter (ZSI) to optimize solar photovoltaic (SPV) system utilization and enhance grid resilience. The proposed system addresses challenges associated with high solar power penetration through various modes, including current control, voltage-frequency control, and reactive power control. It ensures uninterrupted power supply during grid faults, providing flexibility and reliability for grid-connected SPV customers. Challenges and opportunities in reactive power control for microgrids are explored, with simulation results and case studies validating the proposed strategies. From a control and power perspective, the ZSI-based inverter enhances safety, reduces failures, and improves power quality compared to traditional inverters. Operating seamlessly in both grid-connected and islanded modes, it guarantees continuous power supply during grid disturbances. Moreover, the research addresses power quality issues in long distribution feeders during off-peak and night-peak hours or fault conditions. Using the Distributed Static Synchronous Compensator (DSTATCOM) for voltage stability, the control objective is nighttime voltage regulation at the Point of Common Coupling (PCC). In this mode, disconnection of the PV panels, batteries, and battery controller allows the ZSI to operate in voltage-regulating mode, with critical loads remaining connected. The study introduces a structured controller for the reactive power control mode, contributing to a comprehensive and adaptable solution for residential microgrid systems. Mathematical modeling and simulations confirm successful maximum power extraction, controlled voltage, and smooth voltage-frequency regulation.

Keywords: reconfigurable architecture, solar photovoltaic, microgrids, z-source inverter, STATCOM, power quality, battery storage system

Procedia PDF Downloads 22
31559 Toward a Measure of Appropriateness of User Interfaces Adaptations Solutions

Authors: Abderrahim Siam, Ramdane Maamri, Zaidi Sahnoun

Abstract:

The development of adaptive user interfaces (UIs) has long been an important research area in which researchers attempt to call upon the full resources and skills of several disciplines. The adaptive UI community holds thorough knowledge regarding the adaptation of UIs to users and to contexts of use. Several solutions, models, formalisms, techniques, and mechanisms have been proposed to develop adaptive UIs. In this paper, we propose an approach based on fuzzy set theory for modeling the appropriateness of different UI adaptation solutions to the different situations for which interactive systems have to adapt their UIs.
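As a rough illustration of the idea (not the authors' formalism), appropriateness can be modeled as a fuzzy membership degree per context criterion and aggregated with a t-norm; all criterion names and membership functions below are invented:

```python
# Hypothetical sketch: model the "appropriateness" of a UI adaptation solution
# for a given situation as a fuzzy membership degree in [0, 1], aggregated over
# several context criteria with the min t-norm. Criteria are invented.

def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def appropriateness(context, criteria):
    """Aggregate per-criterion memberships with the min t-norm."""
    return min(mf(context[name]) for name, mf in criteria.items())

criteria = {
    "screen_size_in": lambda v: triangular(v, 3.0, 5.5, 8.0),   # suits phone-sized screens
    "ambient_lux":    lambda v: triangular(v, 0.0, 300.0, 1000.0),
}
ctx = {"screen_size_in": 5.5, "ambient_lux": 300.0}
print(appropriateness(ctx, criteria))  # 1.0: fully appropriate for this situation
```

Ranking candidate adaptation solutions then reduces to comparing their aggregated membership degrees for the current situation.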

Keywords: adaptive user interfaces, adaptation solution’s appropriateness, fuzzy sets

Procedia PDF Downloads 494
31558 Exploring the Neural Correlates of Different Interaction Types: A Hyperscanning Investigation Using the Pattern Game

Authors: Beata Spilakova, Daniel J. Shaw, Radek Marecek, Milan Brazdil

Abstract:

Hyperscanning affords a unique insight into the brain dynamics underlying human interaction by simultaneously scanning two or more individuals’ brain responses while they engage in dyadic exchange. This provides an opportunity to observe dynamic brain activations in all individuals participating in the interaction, and possible inter-brain effects among them. The present research aims to provide an experimental paradigm for hyperscanning research capable of delineating among different forms of interaction. Specifically, the goal was to distinguish between two dimensions: (1) interaction structure (concurrent vs. turn-based) and (2) goal structure (competition vs. cooperation). Dual-fMRI was used to scan 22 pairs of participants - each pair matched on gender, age, education and handedness - as they played the Pattern Game. In this simple interactive task, one player attempts to recreate a pattern of tokens while the second player must either help (cooperation) or prevent (competition) the first from achieving the pattern. Each pair played the game iteratively, alternating their roles every round. The game was played in two consecutive sessions: in the first the players took sequential turns (turn-based), but in the second session they placed their tokens concurrently (concurrent). Conventional general linear model (GLM) analyses revealed activations throughout a diffuse collection of brain regions: the cooperative condition engaged the medial prefrontal cortex (mPFC) and posterior cingulate cortex (PCC); in the competitive condition, significant activations were observed in frontal and prefrontal areas, the insular cortices and the thalamus. Comparisons between the turn-based and concurrent conditions revealed greater precuneus engagement in the former. Interestingly, the mPFC, PCC and insulae have been linked repeatedly to social cognitive processes. Similarly, the thalamus is often associated with cognitive empathy; thus its activation may reflect the need to predict the opponent’s upcoming moves. Frontal and prefrontal activations most likely represent the higher attentional and executive demands of the concurrent condition, whereby subjects must simultaneously observe their co-player and place their own tokens accordingly. The activation of the precuneus in the turn-based condition may be linked to self-other distinction processes. Finally, by performing intra-pair correlations of brain responses we demonstrate condition-specific patterns of brain-to-brain coupling in the mPFC and PCC. Moreover, the degree of synchronicity in these neural signals was related to performance in the game. The present results, then, show that different types of interaction recruit different brain systems implicated in social cognition, and that the degree of inter-player synchrony within these brain systems is related to the nature of the social interaction.

Keywords: brain-to-brain coupling, hyperscanning, pattern game, social interaction

Procedia PDF Downloads 344
31557 Online Bakery Management System Proposal

Authors: Alexander Musyoki, Collins Odour

Abstract:

Over the past few years, the bakery industry in Kenya has experienced significant growth, largely due to the increased adoption of technology and automation in its processes; more specifically, due to the adoption of bakery management systems to help in running bakeries. While these systems have been largely responsible for improved productivity and efficiency in bakeries, most of them are now outdated and pose more challenges than benefits. The online bakery management system proposed in this paper aims to address this by allowing bakery owners to track inventory, budget, job progress, and data analytics on each job, and in doing so promotes Sustainable Development Goals 3 and 12, which aim to ensure healthy lives and promote sustainable economic growth. The proposed benefits of these features include scalability, easy accessibility, reduced acquisition costs, better reliability, and improved functionality that will allow bakeries to become more competitive, reduce waste and track inventory more efficiently. To better understand the challenges, a comprehensive study was performed to assess these traditional systems and to determine whether an online bakery management system can prove advantageous to bakery owners. The study gathered feedback from bakery owners and employees in Nairobi County, Kenya, using an online survey with a response rate of about 86% of the target population. The responses cited complex and hard-to-use bakery management systems (59.7%), lack of portability from one device to another (58.1%) and high acquisition costs (51.6%) as the top challenges of traditional bakery management systems. On the other hand, the top benefits that most respondents expected from an online bakery management system were better reliability (58.1%) and reduced acquisition costs (58.1%). Overall, the findings suggest that an online bakery management system has many advantages over traditional systems and is likely to be well received in the market. In conclusion, the proposed online bakery management system has the potential to improve the efficiency and competitiveness of small-sized bakeries in Nairobi County. Further research is recommended to expand the sample size and diversity of respondents and to conduct more in-depth analyses of the data collected.

Keywords: ICT, technology and automation, bakery management systems, food innovation

Procedia PDF Downloads 86
31556 Dynamic Communications Mapping in NoC-Based Heterogeneous MPSoCs

Authors: M. K. Benhaoua, A. K. Singh, A. E. H. Benyamina

Abstract:

In this paper, we propose a heuristic for dynamic communications mapping that considers the placement of communications in order to optimize overall performance. The mapping technique uses a newly proposed algorithm to place communications between the tasks. The proposed placement of communications leads to better optimization of several performance metrics (time and energy consumption). Experimental results show that the proposed mapping approach provides significant performance improvements compared to approaches using static routing.
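The abstract does not give the algorithm itself; as a hedged illustration of what a communication-aware mapping heuristic can look like, the sketch below greedily places tasks on a 2D-mesh NoC so that heavily communicating pairs land on nearby tiles (task names and traffic volumes are invented, and this is not the authors' algorithm):

```python
# Illustrative greedy mapping sketch (not the authors' algorithm): place tasks
# on a 2D-mesh NoC so that heavily communicating task pairs land on nearby
# tiles, approximating the goal of reducing communication time and energy.
from itertools import product

def greedy_map(tasks, comms, mesh_w, mesh_h):
    """comms: {(src, dst): volume}. Returns {task: (x, y)} tile assignment."""
    free = set(product(range(mesh_w), range(mesh_h)))
    placed = {}

    def load(t):  # total traffic a task participates in
        return sum(v for (a, b), v in comms.items() if t in (a, b))

    # Place tasks heaviest-communicators first.
    for t in sorted(tasks, key=load, reverse=True):
        def cost(tile):
            # Volume-weighted Manhattan distance to already-placed partners.
            return sum(v * (abs(tile[0] - placed[o][0]) + abs(tile[1] - placed[o][1]))
                       for (a, b), v in comms.items()
                       for o in ((b if a == t else a),)
                       if t in (a, b) and o in placed)
        best = min(free, key=cost)
        placed[t] = best
        free.remove(best)
    return placed

m = greedy_map(["t0", "t1", "t2"], {("t0", "t1"): 10, ("t1", "t2"): 1}, 2, 2)
# The heavy-traffic pair t0/t1 ends up on adjacent tiles (hop distance 1):
print(abs(m["t0"][0] - m["t1"][0]) + abs(m["t0"][1] - m["t1"][1]))  # 1
```

A dynamic variant would rerun this placement (or an incremental version of it) as tasks arrive and depart at run time.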

Keywords: Multi-Processor Systems-on-Chip (MPSoCs), Network-on-Chip (NoC), heterogeneous architectures, dynamic mapping heuristics

Procedia PDF Downloads 540
31555 Influence of Security Attributes in Component-Based Software Development

Authors: Somayeh Zeinali

Abstract:

A component is generally defined as a piece of executable software with a published interface. Component-based software engineering (CBSE) has become recognized as a new sub-discipline of software engineering. In component-based software development, components cannot be made completely secure and thus easily become vulnerable. Some researchers have investigated this issue and proposed approaches to detect component intrusions or to protect distributed components. Software security also refers to the process of creating software that is considered secure. The terms “dependability”, “trustworthiness”, and “survivability” are used interchangeably to describe the properties of software security.

Keywords: component-based software development, component-based software engineering, software security attributes, dependability, component

Procedia PDF Downloads 561
31554 A Compact Quasi-Zero Stiffness Vibration Isolator Using Flexure-Based Spring Mechanisms Capable of Tunable Stiffness

Authors: Thanh-Phong Dao, Shyh-Chour Huang

Abstract:

This study presents a quasi-zero stiffness (QZS) vibration isolator using flexure-based spring mechanisms which afford both negative and positive stiffness elements and enable self-adjustment. The QZS property of the isolator is achieved at the equilibrium position. A nonlinear mathematical model is then developed, based on the pre-compression of the flexure-based spring mechanisms. The dynamics are further analyzed using the harmonic balance method. The vibration attenuation efficiency is illustrated using displacement transmissibility, which is then compared with that of the corresponding linear isolator. The effects of parameters on performance are also investigated by numerical solutions. The flexure-based spring mechanisms are subsequently designed using the concept of compliant mechanisms and evaluated through ANSYS simulations of the QZS isolator.
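For context, the linear benchmark against which the displacement transmissibility of a QZS isolator is typically compared is the standard single-degree-of-freedom result; the sketch below evaluates it (the damping ratio is illustrative, and this is not the paper's nonlinear model):

```python
# Linear-isolator benchmark (standard single-DOF base-excitation result).
# The damping ratio used below is illustrative.
import math

def linear_transmissibility(r, zeta):
    """Displacement transmissibility at frequency ratio r = omega/omega_n."""
    num = 1 + (2 * zeta * r) ** 2
    den = (1 - r**2) ** 2 + (2 * zeta * r) ** 2
    return math.sqrt(num / den)

# A linear isolator only isolates (T < 1) above r = sqrt(2); a QZS design
# lowers the effective natural frequency, widening this isolation band.
print(linear_transmissibility(0.5, 0.05) > 1)   # True: amplification region
print(linear_transmissibility(3.0, 0.05) < 1)   # True: isolation region
```

At r = sqrt(2) the transmissibility equals 1 for any damping ratio, which marks the crossover between the two regions.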

Keywords: vibration isolator, quasi-zero stiffness, flexure-based spring mechanisms, compliant mechanism

Procedia PDF Downloads 468
31553 Rethinking Nigeria's Foreign Policy in the Age of Global Terrorism

Authors: Shuaibu Umar Abdul

Abstract:

This paper examines Nigeria’s foreign policy in the age of global terrorism. It is worth noting that the threat of ‘terrorism’ is not peculiar to Western and Middle Eastern countries alone; its tentacles are now spreading all over, Africa included. The issue of domestic terrorism in Nigeria has become pervasive since the return of democratic rule in 1999. Such a development had never been witnessed in any form throughout Nigeria's earlier years of statehood; issues of banditry, armed robbery, ritual killing, and criminal activities like kidnapping and pipeline vandalization, the breakdown of law and order, poorly managed infrastructural facilities and corruption have become synonymous with Nigeria. These acts of terrorism no doubt constitute a challenge that necessitates a paradigm shift in Nigeria’s foreign policy. The study employed a conceptual framework of analysis to guide the interrogation; secondary sources were used to generate data, while descriptive and content analysis were used for data presentation and interpretation. In view of the interrogation and discussion of the subject matter, the paper reveals that the Nigerian government underrated and underestimated the strength of terrorism within and outside its policy; hence, it has become difficult to address. In response to the findings and conclusion of the study, the paper recommends, among other things, that Nigeria’s foreign policy be rethought, reshaped and remodeled in cognizance of rising global terrorism, for the sake of peace, growth and development in the country.

Keywords: foreign policy, globe, Nigeria, rethinking, terrorism

Procedia PDF Downloads 363
31552 Human Security as a Tool of Protecting International Human Rights Law

Authors: Arenca Trashani

Abstract:

20 years after its first appearance in a United Nations General Assembly resolution, human security has become a very important tool in a global debate, directly affecting the main rules and regulations of international law and, more specifically, of international human rights law. This paper covers a very important contemporary issue: how human security has influenced the development of international human rights law, not as a challenge, as it has been seen until now, but as a tool for moving toward development and globalization. In order to analyze the impact of human security on the global agenda, we need to look at the main pillars of the international legal order which are affected by human security itself and by its application in policy-making, at both the global and regional levels. This paper also focuses on human security as a new and very important tool for measuring development, stability, the level of democratic consolidation, and respect for human rights, especially in developing countries such as Albania. States are no longer able to monopolize the use of human security within their boundaries, separated from the other principles of a functioning democracy. In this context, human security is best guaranteed under respect for the rule of law and democratization. During the last two decades the concept of security has broadened considerably, from a state-centric to a more human-centric approach: from state security to respect for human rights, to economic security, and to environmental security as well. Last but not least, we will see that human rights can be affected by human security not just in their promotion but also in their enforcement, mainly at the international institutions that are entitled to promote and to protect human rights.

Keywords: human security, international human rights law, development, Albania, international law

Procedia PDF Downloads 761
31551 Symbolic Partial Differential Equations Analysis Using Mathematica

Authors: Davit Shahnazaryan, Diogo Gomes, Mher Safaryan

Abstract:

Many symbolic computations and manipulations required in the analysis of partial differential equations (PDEs) or systems of PDEs are tedious and error-prone. These computations arise when determining conservation laws, entropies or integral identities, which are essential tools for the study of PDEs. Here, we discuss a new Mathematica package for the symbolic analysis of PDEs that automates multiple tasks, saving time and effort. Methodologies: During the research, we used concepts from linear algebra and partial differential equations, and developed algorithms grounded in theoretical mathematics to obtain the results mentioned below. Major findings: Our package provides the following functionalities: finding the symmetry group of different PDE systems; generating polynomials invariant with respect to different symmetry groups; simplifying integral quantities by integration by parts and null-Lagrangian cleaning; computing general forms of expressions by integration by parts; finding equivalent forms of an integral expression that are simpler or more symmetric; and determining necessary and sufficient conditions on the coefficients for the positivity of a given symbolic expression. Conclusion: Using this package, we can simplify integral identities and find conserved and dissipated quantities of a time-dependent PDE or system of PDEs. Some examples from the theory of mean-field games and semiconductor equations are discussed.
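The package itself targets Mathematica; as a minimal Python/SymPy analogue of the kind of identity it automates, the sketch below verifies the pointwise identity behind the dissipated quantity ∫ u²/2 dx for the heat equation u_t = u_xx:

```python
# SymPy analogue of an "integral identity" check: for the heat equation
# u_t = u_xx, verify the pointwise identity  d/dt(u^2/2) = d/dx(u*u_x) - u_x^2.
# Integrating over x (with decaying boundary terms) then gives the dissipation
# law  d/dt ∫ u^2/2 dx = -∫ u_x^2 dx.
import sympy as sp

x, t = sp.symbols("x t")
u = sp.Function("u")(x, t)

lhs = sp.diff(u**2 / 2, t).subs(sp.diff(u, t), sp.diff(u, x, 2))  # use u_t = u_xx
rhs = sp.diff(u * sp.diff(u, x), x) - sp.diff(u, x) ** 2

print(sp.simplify(lhs - rhs))  # 0
```

The same pattern of substituting the PDE into a time derivative and rearranging total derivatives is what integration-by-parts simplification and null-Lagrangian cleaning mechanize at scale.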

Keywords: partial differential equations, symbolic computation, conserved and dissipated quantities, mathematica

Procedia PDF Downloads 165
31550 Concepts in the Design of Lateral-Load Systems in High Rise Buildings to Reduce Operational Energy Consumption

Authors: Mohamed Ali MiladKrem Salem, Sergio F.Breña, Sanjay R. Arwade, Simi T. Hoque

Abstract:

The location of the main lateral-load resisting system in high-rise buildings may have positive impacts on sustainability through a reduction in operational energy consumption, and this paper describes an assessment of the accompanying effects on structural performance. It is found that design for environmental performance strongly influences the structural performance of the building, and that systems selected primarily with an eye towards energy use reduction may require substantial additional structural stiffening to meet safety and serviceability limits under lateral load cases. We present a framework for incorporating the environmental costs of meeting structural design requirements through the embodied energy of the core structural materials, and we also address the issue of economic cost brought on by incorporating environmental concerns into the selection of the structural system. We address these issues through four case-study high-rise buildings with differing structural morphologies (floor plan and core arrangement) and assess each of these building models for cost and embodied energy when the base structural system, which has been suggested by architect Kenneth Yeang based on environmental concerns, is augmented to meet lateral drift requirements under the wind loads prescribed by ASCE 7-10.

Keywords: sustainable, embodied, Outrigger, skyscraper, morphology, efficiency

Procedia PDF Downloads 478
31549 Lamb Wave-Based Blood Coagulation Measurement System Using Citrated Plasma

Authors: Hyunjoo Choi, Jeonghun Nam, Chae Seung Lim

Abstract:

Acoustomicrofluidics has gained much attention due to its advantages, such as noninvasiveness and easy integration with other miniaturized systems, for clinical and biological applications. However, a limitation of acoustomicrofluidics is the complicated and costly fabrication process for electrodes. In this study, we propose a low-cost, lithography-free device that uses Lamb waves for blood analysis. Using a Lamb wave, calcium-ion-removed blood plasma and coagulation reagents can be rapidly mixed for a blood coagulation test. During the coagulation process, the viscosity of the sample increases, and this viscosity change can be monitored through the internal acoustic streaming of microparticles suspended in the sample droplet. The moment at which the acoustic streaming of particles is stopped by the viscosity increase is defined as the coagulation time. With the addition of calcium ions at 0-25 mM, the coagulation time was measured and compared with the conventional index for blood coagulation analysis, prothrombin time, with which it was highly correlated (correlation coefficient 0.94). Therefore, our simple and cost-effective Lamb wave-based blood analysis device has strong potential to be utilized in clinical settings.

Keywords: acoustomicrofluidics, blood analysis, coagulation, lamb wave

Procedia PDF Downloads 343
31548 Model Order Reduction of Continuous LTI Large Descriptor System Using LRCF-ADI and Square Root Balanced Truncation

Authors: Mohammad Sahadet Hossain, Shamsil Arifeen, Mehrab Hossian Likhon

Abstract:

In this paper, we analyze a linear time-invariant (LTI) descriptor system of large dimension. Since such systems are difficult to simulate, compute and store, we attempt to reduce this large system using Low-Rank Cholesky-Factored Alternating Directions Implicit (LRCF-ADI) iteration followed by square-root balanced truncation. LRCF-ADI solves the dual Lyapunov equations of the large system and gives low-rank Cholesky factors of the Gramians as the solution. Using these Cholesky factors, we compute the Hankel singular values via singular value decomposition. Later, implementing square-root balanced truncation, the reduced system is obtained. The Bode plots of the original and lower-order systems are used to show that the magnitude and phase responses are the same for both systems.

Keywords: low-rank Cholesky factor alternating directions implicit iteration, LTI descriptor system, Lyapunov equations, square-root balanced truncation

Procedia PDF Downloads 420
31547 Rights-Based Approach to Artificial Intelligence Design: Addressing Harm through Participatory ex ante Impact Assessment

Authors: Vanja Skoric

Abstract:

The paper examines whether the impacts of artificial intelligence (AI) can be meaningfully addressed through a rights-based approach to AI design, investigating in particular how an inclusive, participatory process of assessing AI impact would make this viable. There is a significant gap between envisioning rights-based AI systems and their practical application. Plausibly, internalizing a human rights approach within the AI design process might be achieved by identifying and assessing the implications of AI features for human rights, especially considering the case of vulnerable individuals and communities. However, there is no clarity or consensus on how such an instrument should be operationalised to usefully identify the impact, mitigate harms and meaningfully ensure relevant stakeholders’ participation. In practice, ensuring the meaningful inclusion of those individuals, groups, or entire communities who are affected by the use of the AI system is a prerequisite for a process seeking to assess human rights impacts and risks. Engagement in the entire process of the impact assessment should enable those affected and interested to access information and better understand the technology, product, or service and the resulting impacts, but also to learn about their rights and the respective obligations and responsibilities of developers and deployers to protect and/or respect these rights. This paper will provide an overview of the study and practice of the participatory design process for AI, including inclusive impact assessment and its main elements; propose a framework; and discuss the lessons learned from the existing theory. In addition, it will explore pathways for enhancing and promoting individual and group rights through such engagement by discussing when, how, and whom to include, at which stage of the process, and what the prerequisites for meaningful engagement are. The overall aim is to ensure use of the technology that works for the benefit of society, individuals, and particular (historically marginalised) groups.

Keywords: rights-based design, AI impact assessment, inclusion, harm mitigation

Procedia PDF Downloads 155
31546 Molecular Dynamics Simulation for Buckling Analysis at Nanocomposite Beams

Authors: Babak Safaei, A. M. Fattahi

Abstract:

In the present study we investigated the axial buckling characteristics of nanocomposite beams reinforced by single-walled carbon nanotubes (SWCNTs). Various beam theories, including Euler-Bernoulli beam theory, Timoshenko beam theory and Reddy beam theory, were used to analyze the buckling behavior of carbon nanotube-reinforced composite beams. The generalized differential quadrature (GDQ) method was utilized to discretize the governing differential equations along with four commonly used boundary conditions. The material properties of the nanocomposite beams were obtained using molecular dynamics (MD) simulations corresponding to both short-(10,10) SWCNT and long-(10,10) SWCNT composites embedded in an amorphous polyethylene matrix. The results obtained directly from the MD simulations were then matched with those calculated by the mixture rule to extract appropriate values of the carbon nanotube efficiency parameters accounting for the scale-dependent material properties. Selected numerical results are presented to indicate the influences of nanotube volume fraction and end supports on the critical axial buckling loads of nanocomposite beams for long- and short-nanotube composites.
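As a hedged illustration of the GDQ discretization mentioned above, the sketch below builds first-derivative weighting coefficients on an arbitrary grid from the standard Lagrange-interpolation (Shu) formulas; the grid and test function are illustrative:

```python
# Generalized differential quadrature (GDQ) sketch: first-derivative weighting
# coefficients on an arbitrary grid, so that f'(x_i) ≈ sum_j a[i, j] * f(x_j).
import numpy as np

def gdq_first_derivative_weights(x):
    n = len(x)
    a = np.zeros((n, n))
    # M(x_i) = prod_{k != i} (x_i - x_k)
    M = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    for i in range(n):
        for j in range(n):
            if i != j:
                a[i, j] = M[i] / ((x[i] - x[j]) * M[j])
        a[i, i] = -a[i].sum()   # rows sum to zero (derivative of a constant is 0)
    return a

x = np.array([0.0, 0.3, 0.7, 1.0])       # 4 points differentiate cubics exactly
a = gdq_first_derivative_weights(x)
print(np.allclose(a @ x**3, 3 * x**2))   # True
```

Applying such weight matrices to the governing beam equations turns the buckling problem into a generalized matrix eigenvalue problem for the critical load.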

Keywords: nanocomposites, molecular dynamics simulation, axial buckling, generalized differential quadrature (GDQ)

Procedia PDF Downloads 326
31545 The Development of Web Based Instruction on Puppet Show

Authors: Piyanut Sujit

Abstract:

The purposes of this study were to: 1) create knowledge and develop web-based instruction on the puppet show, 2) evaluate the effectiveness of the web-based instruction on the puppet show using the 80/80 criterion, and 3) compare and analyze the achievement of the students before and after learning with the web-based instruction on the puppet show. The population of this study comprised 53 students in the Program of Library and Information Sciences who registered for the subject Reading and Reading Promotion in semester 1/2011, Suan Sunandha Rajabhat University. The research instruments consisted of the web-based instruction on the puppet show, a specialist evaluation form, an achievement test, and tests during the lesson. The research statistics included arithmetic mean, variable means, standard deviation, and the t-test in SPSS for Windows. The results revealed that the effectiveness of the developed web-based instruction was 84.67/80.47, which was higher than the set criterion of 80/80. Student achievement before and after learning showed a statistically significant difference at the 0.05 level, as hypothesized.
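For readers unfamiliar with the 80/80 convention, the E1/E2 efficiency pair reported above is typically computed as the mean formative (in-lesson) and summative (post-test) score percentages; the score lists and maxima below are invented for illustration:

```python
# Hedged sketch of how an E1/E2 efficiency pair (the 80/80 criterion) is
# typically computed; all scores below are invented.

def efficiency(process_scores, process_max, posttest_scores, posttest_max):
    """E1 = mean in-lesson (formative) score %, E2 = mean post-test score %."""
    e1 = 100.0 * sum(process_scores) / (len(process_scores) * process_max)
    e2 = 100.0 * sum(posttest_scores) / (len(posttest_scores) * posttest_max)
    return round(e1, 2), round(e2, 2)

# Three hypothetical students, in-lesson tests out of 50, post-test out of 40:
print(efficiency([42, 44, 41], 50, [33, 32, 31], 40))  # (84.67, 80.0)
```

The instruction meets an 80/80 criterion when both E1 and E2 reach 80 or above, as the reported 84.67/80.47 does.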

Keywords: puppet, puppet show, web based instruction, library and information sciences

Procedia PDF Downloads 370
31544 Comparing the SALT and START Triage System in Disaster and Mass Casualty Incidents: A Systematic Review

Authors: Hendri Purwadi, Christine McCloud

Abstract:

Triage is a complex decision-making process that aims to categorize a victim’s level of acuity and need for medical assistance. Two triage systems widely used in mass casualty incidents (MCIs) and disaster situations are START (simple triage and rapid treatment) and SALT (sort, assess, lifesaving interventions, treatment/transport). There is ongoing controversy regarding the effectiveness of SALT relative to START. This systematic review investigates and compares the effectiveness of the SALT and START triage systems in disaster and MCI settings. Literature published between 2009 and 2019 was searched via a systematic search strategy in PubMed, the Cochrane Library, CINAHL, Scopus, ScienceDirect, Medlib, and ProQuest. The review included simulation-based and medical-record-based studies investigating the accuracy and applicability of the SALT and START triage systems in adult and paediatric populations during MCIs and disasters; all study types were included. The Joanna Briggs Institute critical appraisal tools were used to assess the quality of the reviewed studies. Of 1,450 articles identified in the search, 10 were included. The review identified four themes: accuracy, under-triage, over-triage, and time to triage per individual victim. The START system showed a wide and inconsistent range of accuracy compared with SALT (44% to 94.2% for START versus 70% to 83% for SALT). The under-triage error of START ranged from 2.73% to 20%, slightly lower than that of SALT (7.6% to 23.3%). The over-triage error of START was slightly greater than that of SALT (2% to 53% versus 2% to 22%). START was also faster to apply (70 to 72.18 seconds versus 78 seconds per victim for SALT). Consequently, the START triage system has a lower under-triage error and is faster than SALT in classifying victims of MCIs and disasters, whereas SALT is slightly more accurate and has a lower over-triage error. However, the magnitude of these differences is relatively small, and the effect on patient outcomes is therefore unlikely to be significant. Hence, regardless of triage error, either the START or the SALT triage system is equally effective for triaging victims of disasters and MCIs.
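The error definitions above can be made concrete with a short sketch. The three-level acuity scale and the sample assignments below are hypothetical, used only to illustrate how accuracy, under-triage, and over-triage rates are computed from assigned versus reference categories:

```python
LEVELS = ["minor", "delayed", "immediate"]   # assumed ordinal acuity scale

def triage_error_rates(assigned, reference):
    """Fractions of correct, under-triaged, and over-triaged victims."""
    rank = {lvl: i for i, lvl in enumerate(LEVELS)}
    n = len(assigned)
    correct = sum(a == r for a, r in zip(assigned, reference))
    # Under-triage: assigned a LOWER acuity than the reference standard.
    under = sum(rank[a] < rank[r] for a, r in zip(assigned, reference))
    # Over-triage: assigned a HIGHER acuity than the reference standard.
    over = sum(rank[a] > rank[r] for a, r in zip(assigned, reference))
    return correct / n, under / n, over / n

assigned  = ["minor", "immediate", "delayed", "immediate", "immediate"]
reference = ["minor", "immediate", "immediate", "delayed", "immediate"]
acc, under, over = triage_error_rates(assigned, reference)   # 0.6, 0.2, 0.2
```

In this toy sample, one victim is under-triaged (delayed instead of immediate) and one over-triaged (immediate instead of delayed), giving 60% accuracy with 20% error of each kind.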

Keywords: disaster, effectiveness, mass casualty incidents, START triage system, SALT triage system

Procedia PDF Downloads 136
31543 Monitoring and Evaluation of Web-Services Quality and Medium-Term Impact on E-Government Agencies' Efficiency

Authors: A. F. Huseynov, N. T. Mardanov, J. Y. Nakhchivanski

Abstract:

This practical research aims to improve the management quality and efficiency of public administration agencies providing e-services. The monitoring system developed will provide continuous review of the websites’ compliance with the selected indicators, their evaluation against those indicators, and a ranking of services according to the quality criteria. The responsible departments in the government agencies were surveyed; the questionnaire covers management and feedback, the e-services provided, and the application of information systems. By analyzing the main contributing factors and barriers, recommendations are given that lead to decisions strengthening the state agencies’ competencies in managing and providing their services. Component 1. E-services monitoring system. Three separate monitoring activities are proposed to be executed in parallel. First, continuous tracing of e-government sites using a built-in web-monitoring program; this program generates several quantitative values relating mainly to the technical characteristics and performance of the websites. Second, expert assessment of e-government sites against two general criteria. Criterion 1: technical quality of the site. Criterion 2: usability/accessibility (load, see, use). Each high-level criterion is in turn subdivided into several sub-criteria, such as the fonts and background colour (is it readable?), W3C coding standards, availability of robots.txt and a site map, the search engine, the feedback/contact facilities, and the security mechanisms. Third, an online survey of users/citizens: a small group of questions embedded in the e-service websites, concerning navigation, the user’s experience with the website (whether positive or negative), etc. Automated monitoring of websites cannot on its own capture the whole evaluation process and should therefore be seen as a complement to expert manual web evaluations. The separate results are integrated to provide the complete evaluation picture. Component 2. Assessment of the efficiency of agencies/departments in providing e-government services: - the relevant indicators for evaluating the efficiency and effectiveness of e-services were identified; - a survey was conducted in all governmental organizations (ministries, committees, and agencies) that provide electronic services to citizens or businesses; - the quantitative and qualitative measures cover the following areas of activity: e-governance, e-services, user feedback, and the information systems at the agencies’ disposal. Main results: 1. A software program and a set of indicators for website evaluation have been developed, and the results of pilot monitoring are presented. 2. The (internal) efficiency of the e-government agencies was evaluated on the basis of the survey results, with practical recommendations concerning human potential, the information systems used, and the e-services provided.
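As a rough illustration of how the expert-assessment scores might be aggregated, the sketch below computes per-criterion and overall scores and ranks two sites; the weights, sub-criterion names, and score values are assumptions for demonstration, not the authors’ actual scoring tool:

```python
# Hypothetical sub-criteria grouped under the two high-level criteria,
# each with an assumed weight (weights within a criterion sum to 1).
CRITERIA = {
    "technical_quality": {"w3c_standards": 0.4, "robots_and_sitemap": 0.3,
                          "security": 0.3},
    "usability": {"readability": 0.4, "search_engine": 0.3,
                  "feedback_contact": 0.3},
}

def site_score(scores):
    """scores: {sub_criterion: value in [0, 1]} -> (per-criterion, overall)."""
    per_criterion = {
        name: sum(w * scores[sub] for sub, w in subs.items())
        for name, subs in CRITERIA.items()
    }
    overall = sum(per_criterion.values()) / len(per_criterion)
    return per_criterion, overall

sites = {
    "agency_a": {"w3c_standards": 1.0, "robots_and_sitemap": 0.5, "security": 1.0,
                 "readability": 0.8, "search_engine": 1.0, "feedback_contact": 0.5},
    "agency_b": {"w3c_standards": 0.5, "robots_and_sitemap": 1.0, "security": 0.5,
                 "readability": 1.0, "search_engine": 0.5, "feedback_contact": 1.0},
}
ranking = sorted(sites, key=lambda s: site_score(sites[s])[1], reverse=True)
```

The same aggregation pattern extends naturally to the quantitative values from the web-monitoring program and the user-survey responses, once each is normalized to a common scale.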

Keywords: e-government, web-sites monitoring, survey, internal efficiency

Procedia PDF Downloads 308
31542 Rainwater Management in Smart City: Focus in Gomti Nagar Region, Lucknow, Uttar Pradesh, India

Authors: Priyanka Yadav, Rajkumar Ghosh, Alok Saini

Abstract:

Human civilization cannot exist and thrive in the absence of adequate water; as a result, water plays an important role in human existence even in smart cities. The key causes of the current catastrophic water scarcity crisis are lifestyle changes, over-exploitation of groundwater, water overuse, rapid urbanization, and uncontrolled population growth. Furthermore, salty water seeps into deeper aquifers, causing land subsidence. This study of artificial groundwater recharge addresses the water shortage in Gomti Nagar, Lucknow. Submersible pumps are the most common means of drawing fresh groundwater in the Gomti Nagar neighbourhood of Lucknow. The area has a groundwater depletion rate of 1968 m3/day/km2 and, on the existing groundwater abstraction scale of Zone A to Zone D, is categorized as Zone A (very high abstraction). Harvesting rainwater using rooftop rainwater harvesting systems (RTRWHs) is an effective method of reducing aquifer depletion within a sustainable water management system. Owing to a water imbalance of 24,519 ML/yr, the Gomti Nagar region faces severe groundwater depletion. According to the Lucknow Development Authority (LDA), installed RTRWHs (plot area 300 sq. m) currently capture only 0.04 percent of the rainfall in the Gomti Nagar region of Lucknow; their influence will be far greater when RTRWHs are deployed in all buildings. Bye-laws in India mandate the installation of RTRWHs on plots greater than 300 sq. m. A better India without water problems could be realized by installing residential and commercial rooftop rainwater collecting systems in every structure. The present study recommends RTRWHs as an alternative source of water to bridge the gap between groundwater recharge and extraction in smart cities such as Gomti Nagar, Lucknow, India.
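The harvesting potential behind such figures follows the standard volume formula (roof area times rainfall times a runoff coefficient). The sketch below applies it to a 300 sq. m plot, the bye-law threshold mentioned above; the rainfall figure and runoff coefficient are illustrative assumptions, not values from the study:

```python
def harvest_potential_litres(roof_area_m2, annual_rainfall_mm, runoff_coeff=0.8):
    """Annual harvestable volume in litres (1 mm of rain over 1 m^2 = 1 litre)."""
    return roof_area_m2 * annual_rainfall_mm * runoff_coeff

# A 300 sq. m plot with an assumed 1000 mm of annual rainfall and a
# runoff coefficient of 0.8 for a hard roof surface:
volume = harvest_potential_litres(300, 1000)   # 240,000 litres, i.e. 240 m^3
```

Even under these rough assumptions, a single threshold-size plot can intercept hundreds of cubic metres per year, which is why mandating RTRWHs across all buildings scales into a meaningful recharge contribution.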

Keywords: groundwater recharge, RTRWHs, harvested rainwater, rainfall, water extraction

Procedia PDF Downloads 113
31541 Quantitative Phase Imaging System Based on a Three-Lens Common-Path Interferometer

Authors: Alexander Machikhin, Olga Polschikova, Vitold Pozhar, Alina Ramazanova

Abstract:

White-light quantitative phase imaging is an effective technique for achieving sub-nanometer phase sensitivity. Highly stable interferometers based on common-path geometry have been developed in recent years for this task, and some of these systems also apply a multispectral approach. The purpose of this research is to propose a simple and effective interferometer for such systems. We developed a three-lens common-path interferometer that can be used for quantitative phase imaging with or without a multispectral modality. The lens system consists of two components, the first of which is a compound lens built from two lenses; a pinhole is placed between the components. This lens-in-lens approach enables efficient light transmission and high stability of the interferometer. Multispectral operation is easily implemented by placing a tunable filter in front of the interferometer; in our work, we used an acousto-optical tunable filter. Design considerations are discussed, and multispectral quantitative phase retrieval is demonstrated.

Keywords: acousto-optical tunable filter, common-path interferometry, digital holography, multispectral quantitative phase imaging

Procedia PDF Downloads 312
31540 Music Genre Classification Based on Non-Negative Matrix Factorization Features

Authors: Soyon Kim, Edward Kim

Abstract:

In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become increasingly important. Despite the subjectivity and controversy over the definition of music genres across nations and cultures, automatic genre classification systems have been developed to facilitate the process of music categorization, with manual genre selections by music producers providing the statistical data for their design. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal are captured by timbre features such as the mel-frequency cepstral coefficients (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal are summarized with (1) statistical features such as the mean, variance, minimum, and maximum of the timbre features and (2) modulation spectrum features such as the spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. In addition to these conventional long-term feature vectors, NMF-based feature vectors are proposed for use in genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated both in the log spectral magnitude domain (NMF-LSM) and in the basic feature vector domain (NMF-BFV). For NMF-LSM, the entire full-band spectrum was used; for NMF-BFV, only the low-band spectrum was used, since the high-frequency modulation spectrum of the basic feature vectors did not carry information important for genre classification. In the test stage, using the set of pre-trained NMF basis vectors, the classification system extracted the NMF weighting values for each genre as the NMF feature vectors. A support vector machine (SVM) was used as the classifier. The GTZAN multi-genre music database, composed of 10 genres with 100 songs per genre, was used for training and testing, and 10-fold cross-validation was used to increase the reliability of the experiments. For a given input song, an extracted NMF-LSM feature vector comprised 10 weighting values corresponding to the classification probabilities for the 10 genres; an NMF-BFV feature vector likewise had a dimensionality of 10. Combined with the basic long-term features (statistical and modulation spectrum features), the NMF features increased accuracy with only a slight increase in feature dimensionality. The basic features by themselves yielded 84.0% accuracy, while the basic features with NMF-LSM and with NMF-BFV yielded 85.1% and 84.2%, respectively; the basic features required a dimensionality of 460, whereas NMF-LSM and NMF-BFV each required only 10. Combining the basic features, NMF-LSM, and NMF-BFV with an SVM using a radial basis function (RBF) kernel produced a significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
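A minimal sketch of the per-genre NMF idea, using scikit-learn on synthetic data in place of GTZAN and real timbre features (an assumption for illustration). Here each genre’s pre-trained basis scores a song by reconstruction quality, a simplified stand-in for the paper’s per-genre weighting-value features, and the resulting 10-dimensional vectors feed an RBF-kernel SVM:

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_genres, n_per_genre, n_dims = 10, 20, 40

# Synthetic non-negative "feature vectors", one loose cluster per genre.
X = np.vstack([rng.random((n_per_genre, n_dims)) + 0.1 * g
               for g in range(n_genres)])
y = np.repeat(np.arange(n_genres), n_per_genre)

# Training stage: one NMF basis set per genre class.
bases = [NMF(n_components=4, init="random", random_state=0, max_iter=500)
         .fit(X[y == g]) for g in range(n_genres)]

def nmf_features(x):
    """Reconstruction error of x under each genre's basis -> 10-dim vector."""
    errs = []
    for model in bases:
        w = model.transform(x.reshape(1, -1))      # NMF weighting values
        errs.append(np.linalg.norm(x - w @ model.components_))
    return np.array(errs)

feats = np.vstack([nmf_features(x) for x in X])    # one 10-dim vector per song
clf = SVC(kernel="rbf").fit(feats, y)              # RBF-kernel SVM classifier
```

In the paper’s full system these 10-dimensional NMF vectors would be concatenated with the 460-dimensional basic long-term features before classification.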

Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)

Procedia PDF Downloads 303
31539 Nano-Sensors: Search for New Features

Authors: I. Filikhin, B. Vlahovic

Abstract:

We focus on a novel type of detection based on the electron tunneling properties of double nanoscale structures in semiconductor materials. Semiconductor heterostructures such as quantum wells (QWs), quantum dots (QDs), and quantum rings (QRs) may have an energy-level structure of several hundred electron confinement states. The single-electron spectra of double quantum objects (DQWs, DQDs, and DQRs) were studied in our previous works in relation to electron localization and tunneling between the objects. The electron wave function may be localized in one of the QDs or delocalized, spread over the whole system. Localizing-delocalizing tunneling occurs when a transition between these two states is possible. The tunneling properties of the spectra differ strongly for “regular” and “chaotic” systems. We have shown that a small violation of the geometry drastically affects the localization of the electron; in particular, such violations eliminate the delocalized states of the system. The same symmetry-violation effect occurs if electric or magnetic fields are applied. These phenomena could be used to propose a new type of detection based on the high sensitivity of charge transport between double nanostructures to small violations of their shapes, and may have significant technological implications.
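The localization sensitivity described above can be illustrated with a minimal two-level model (an idealization, not the authors’ full confinement-state calculation): two dot sites with on-site energies e1 and e2 coupled by a tunneling amplitude t. A small detuning, standing in for a geometry or field violation, destroys the delocalized ground state:

```python
import numpy as np

def ground_state_weight(e1, e2, t):
    """Probability of finding the electron on dot 1 in the ground state."""
    H = np.array([[e1, t], [t, e2]])   # two-site (double-dot) Hamiltonian
    _, vecs = np.linalg.eigh(H)        # eigenvalues in ascending order
    return vecs[0, 0] ** 2             # dot-1 component of the ground state

# Symmetric dots: the ground state is an equal superposition (delocalized).
w_sym = ground_state_weight(0.0, 0.0, -0.01)   # 0.5

# A small detuning, much larger than |t|, localizes the electron on one dot.
w_det = ground_state_weight(0.0, 0.5, -0.01)   # close to 1.0
```

Because the crossover happens once the detuning exceeds the (typically tiny) tunneling coupling, charge transport between the dots is an extremely sensitive probe of shape or field perturbations, which is the detection principle proposed here.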

Keywords: double quantum dots, single electron levels, tunneling, electron localizations

Procedia PDF Downloads 509
31538 Harmonic Mitigation and Total Harmonic Distortion Reduction in Grid-Connected PV Systems: A Case Study Using Real-Time Data and Filtering Techniques

Authors: Atena Tazikeh Lemeski, Ismail Ozdamar

Abstract:

This study presents a detailed analysis of harmonic distortion in a grid-connected photovoltaic (PV) system using real-time data captured from a solar power plant. Harmonics introduced by inverters in PV systems can degrade power quality and lead to increased Total Harmonic Distortion (THD), which poses challenges such as transformer overheating, increased power losses, and potential grid instability. This research addresses these issues by applying Fast Fourier Transform (FFT) to identify significant harmonic components and employing notch filters to target specific frequencies, particularly the 3rd harmonic (150 Hz), which was identified as the largest contributor to THD. Initial analysis of the unfiltered voltage signal revealed a THD of 21.15%, with prominent harmonic peaks at 150 Hz, 250 Hz and 350 Hz, corresponding to the 3rd, 5th, and 7th harmonics, respectively. After implementing the notch filters, the THD was reduced to 5.72%, demonstrating the effectiveness of this approach in mitigating harmonic distortion without affecting the fundamental frequency. This paper provides practical insights into the application of real-time filtering techniques in PV systems and their role in improving overall grid stability and power quality. The results indicate that targeted harmonic mitigation is crucial for the sustainable integration of renewable energy sources into modern electrical grids.
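A sketch of the filtering step described above, with a synthetic 50 Hz signal carrying 3rd, 5th, and 7th harmonics standing in for the plant data (the harmonic amplitudes are assumptions): an FFT-based THD estimate drops once a notch filter removes the dominant 150 Hz component.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs, f0 = 10_000, 50                      # sample rate and fundamental (Hz)
t = np.arange(0, 1, 1 / fs)              # 1 s window -> FFT bin k is k Hz
signal = (np.sin(2 * np.pi * f0 * t)
          + 0.18 * np.sin(2 * np.pi * 3 * f0 * t)   # 150 Hz, dominant
          + 0.08 * np.sin(2 * np.pi * 5 * f0 * t)   # 250 Hz
          + 0.05 * np.sin(2 * np.pi * 7 * f0 * t))  # 350 Hz

def thd(x):
    """THD from FFT magnitudes: harmonic RMS over the fundamental magnitude."""
    spec = np.abs(np.fft.rfft(x))
    harmonics = spec[[3 * f0, 5 * f0, 7 * f0]]
    return np.sqrt(np.sum(harmonics ** 2)) / spec[f0]

b, a = iirnotch(w0=3 * f0, Q=30, fs=fs)  # narrow notch at the 3rd harmonic
filtered = filtfilt(b, a, signal)        # zero-phase filtering
```

With a high Q, the notch removes the 150 Hz line while leaving the 50 Hz fundamental essentially untouched, mirroring the paper’s targeted mitigation of the largest THD contributor.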

Keywords: grid-connected photovoltaic systems, fast Fourier transform, harmonic filtering, inverter-induced harmonics

Procedia PDF Downloads 46
31537 An AI-generated Semantic Communication Platform in HCI Course

Authors: Yi Yang, Jiasong Sun

Abstract:

Almost every aspect of our daily lives is now intertwined with some degree of human-computer interaction (HCI). HCI courses draw on knowledge from disciplines as diverse as computer science, psychology, design principles, and anthropology. Our HCI course, named the Media and Cognition course, is constantly updated to reflect state-of-the-art technological advancements such as virtual reality, augmented reality, and artificial-intelligence-based interaction. For more than a decade, the course has used an interest-based approach to teaching, in which students proactively propose research-based questions and collaborate with teachers, applying course knowledge to explore potential solutions. Semantic communication plays a key role in facilitating understanding and interaction between users and computer systems, ultimately enhancing system usability and user experience. Advances in AI-generated content, which have gained significant attention from both academia and industry in recent years, are exemplified by language models such as GPT-3 that generate human-like dialogue from given prompts. The latest version of our Human-Computer Interaction course includes a practical semantic communication platform based on AI-generation techniques. The purpose of this semantic communication is twofold: to extract and transmit task-specific information, and to ensure efficient end-to-end communication with minimal latency. The platform evaluates the retainability of signal sources: low-retainability visual signals are converted into textual prompts, transmitted, and reconstructed at the receiving end using AI-generation techniques, whereas visual signals with a high retainability rate are compressed and transmitted according to their respective regions. The platform and the associated research are a testament to our students' growing ability to independently investigate state-of-the-art technologies.
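The branching policy just described might be sketched as follows; the function names, threshold, and payload format are hypothetical, meant only to show the low-retainability prompt path versus the high-retainability compression path:

```python
RETAINABILITY_THRESHOLD = 0.6   # assumed cut-off, not from the paper

def transmit(signal):
    """signal: dict with 'retainability' in [0, 1] and raw 'data'."""
    if signal["retainability"] < RETAINABILITY_THRESHOLD:
        # Low retainability: summarize into a textual prompt; the receiver
        # regenerates the visual content from the prompt with a generative model.
        return {"mode": "prompt", "payload": "prompt:" + signal["data"][:16]}
    # High retainability: compress and transmit the content region by region.
    return {"mode": "compressed", "payload": signal["data"]}

sent = transmit({"retainability": 0.3, "data": "a cat on a red sofa"})
```

The prompt path trades pixel fidelity for a drastic reduction in payload size, which is the latency-versus-semantics trade-off the platform is built to explore.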

Keywords: human-computer interaction, media and cognition course, semantic communication, retainability, prompts

Procedia PDF Downloads 124
31536 National Core Indicators - Aging and Disabilities: A Person-Centered Approach to Understanding Quality of Long-Term Services and Supports

Authors: Stephanie Giordano, Rosa Plasencia

Abstract:

In the USA in 2013, public service systems such as Medicaid, aging, and disability systems undertook an effort to measure the quality of service delivery by examining the experiences and outcomes of those receiving public services. The goal was to develop a survey measuring those experiences and outcomes so that system performance could be assessed for quality improvement. The performance indicators were developed with input from directors of state aging and disability service systems, along with experts and stakeholders in the field across the United States. This effort, National Core Indicators - Aging and Disabilities (NCI-AD), grew out of National Core Indicators - Intellectual and Developmental Disabilities, an effort to measure developmental disability (DD) systems across the states. The survey tool and administration protocol underwent multiple rounds of testing and revision between 2013 and 2015. The measures in the final tool, the Adult Consumer Survey (ACS), capture not only important indicators of healthcare access and personal safety but also indicators of system quality based on person-centered outcomes: whether service systems support older adults and people with disabilities to live where they want, maintain relationships, engage in their communities, and exercise choice and control in their everyday lives. Launched in 2015, the NCI-AD Adult Consumer Survey is now used in 23 US states. Surveys are conducted by NCI-AD-trained surveyors in direct conversation with a person receiving public long-term services and supports (LTSS). Until 2020, surveys were conducted only in person; after a pilot tested the reliability of videoconference and telephone survey modes, these modes were also adopted as acceptable practice. 
The survey is administered as a “guided conversation”, allowing the surveyor to use wording and terminology best understood by the person surveyed. A subset of questions may be answered by a proxy respondent who knows the person well if the person receiving services is unable to provide valid responses on their own. Surveyors undergo standardized training on survey administration to ensure its fidelity. In addition to the main survey section, a Background Information section collects data on personal and service-related characteristics of the person receiving services; these data are typically drawn from state administrative records. This information provides greater context on the characteristics of people receiving services and has also been used in conjunction with outcome measures to examine disparities (including by race and ethnicity, gender, disability, and living arrangement). These measures of quality are critical for public service delivery systems seeking to understand the unique needs of older adults and people with disabilities and to improve their lives. Participating states may use the data to identify areas for quality improvement within their service delivery systems, to advocate for specific policy changes, and to better understand the experiences of specific populations of people served.

Keywords: quality of life, long term services and supports, person-centered practices, aging and disability research, survey methodology

Procedia PDF Downloads 124
31535 An Empirical Investigation of Montesquieu’s Theories on Climate

Authors: Lisa J. Piergallini

Abstract:

This project uses panel regression analyses to investigate the relationships between geography, institutions, and economic development, as guided by the theories of the 18th-century French philosopher Montesquieu. Contemporary scholars of political economy perpetually misinterpret Montesquieu’s theories on climate, and in doing so they miss what could be the key to resolving the geography-versus-institutions debate. There is a conspicuous gap in this literature: it does not consider whether geography and institutions might have an interactive, dynamic effect on economic development. This project seeks to bridge that gap. Data are used for all available countries over the years 1980-2013. Two interaction terms between geographic and institutional variables are employed in the empirical analyses, offering a unique contribution to the ongoing geography-versus-institutions debate within the political economy literature. This study finds that there is indeed an interactive effect between geography and institutions, and that this interaction has a statistically significant effect on economic development. Democracy (as measured by Polity score) and rule of law and property rights (as measured by the Fraser index) have positive effects on economic development (as measured by GDP per capita), yet the magnitude of these effects is stronger where a low percentage of the national population lives in the geographical tropics. This has implications for promoting economic development, and it highlights the importance of understanding geographical context.
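The interaction-term specification at the heart of such an analysis can be sketched on synthetic data (variable names follow the abstract; the coefficients and data-generating process are illustrative, not the study’s estimates):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
polity = rng.uniform(-10, 10, n)     # democracy score (Polity-style scale)
tropics = rng.uniform(0, 1, n)       # share of population in the tropics
# Institutions help more where the tropical share is low: negative interaction.
log_gdp = (8.0 + 0.10 * polity - 1.5 * tropics
           - 0.05 * polity * tropics + rng.normal(0, 0.5, n))

# OLS with the interaction term:
#   y = b0 + b1*polity + b2*tropics + b3*(polity * tropics)
X = np.column_stack([np.ones(n), polity, tropics, polity * tropics])
beta, *_ = np.linalg.lstsq(X, log_gdp, rcond=None)
# The marginal effect of democracy is beta[1] + beta[3] * tropics,
# weakening as the tropical population share rises.
```

A negative fitted interaction coefficient is exactly the pattern the abstract reports: institutional quality raises GDP per capita, but by less in heavily tropical countries.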

Keywords: Montesquieu, institutions, geography, economic development, political philosophy, political economy

Procedia PDF Downloads 258
31534 Automated Feature Extraction and Object-Based Detection from High-Resolution Aerial Photos Based on Machine Learning and Artificial Intelligence

Authors: Mohammed Al Sulaimani, Hamad Al Manhi

Abstract:

With the development of remote sensing technology, the resolution of optical remote sensing images has greatly improved, and such images have become widely available. Numerous detectors have been developed for detecting different types of objects. In the past few years, remote sensing has benefited greatly from deep learning, particularly deep convolutional neural networks (CNNs), which hold great promise for the challenging needs of remote sensing and for solving problems across many fields and applications. Unmanned aerial systems (UAS) have become the preferred means of acquiring aerial photos in most organizations because of their high resolution and accuracy, which make the identification and detection of very small features much easier than with satellite images; this has opened a new era for deep learning not only in feature extraction and prediction but also in analysis. This work addresses the capacity of machine learning and deep learning to detect and extract oil leaks from onshore flowlines using high-resolution aerial photos acquired by a UAS fitted with an RGB sensor, supporting early detection of leaks and preventing the company’s losses and, most importantly, environmental damage. Two approaches using different deep learning methods are demonstrated. The first detects oil leaks in the raw (unprocessed) aerial photos using the Single Shot Detector (SSD) deep learning model; the model draws bounding boxes around the leaks, and the results were extremely good. The second detects oil leaks in orthomosaicked (georeferenced) images by developing three deep learning models (Mask R-CNN, U-Net, and a PSPNet classifier). Post-processing is then performed to combine the results of these three models to achieve better detection and improved accuracy. Although a relatively small amount of data was available for training, the trained models have shown good results in extracting the extent of the oil leaks, with excellent and accurate detection.
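One plausible way to fuse the three segmentation models’ outputs in post-processing (the paper does not specify its exact combination rule) is a per-pixel majority vote over the binary leak masks:

```python
import numpy as np

def majority_vote(masks):
    """masks: list of equal-shape binary arrays -> fused binary mask."""
    stacked = np.stack(masks)
    votes_needed = len(masks) // 2 + 1          # strict majority
    return (stacked.sum(axis=0) >= votes_needed).astype(np.uint8)

# Toy 2x3 masks standing in for Mask R-CNN, U-Net, and PSPNet predictions.
mask_rcnn = np.array([[1, 1, 0], [0, 1, 0]], dtype=np.uint8)
unet      = np.array([[1, 0, 0], [0, 1, 1]], dtype=np.uint8)
pspnet    = np.array([[1, 1, 0], [1, 1, 0]], dtype=np.uint8)
fused = majority_vote([mask_rcnn, unet, pspnet])
# fused -> [[1, 1, 0], [0, 1, 0]]
```

Requiring agreement from at least two of the three models suppresses each model’s idiosyncratic false positives, which is one way an ensemble can improve accuracy over any single detector.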

Keywords: GIS, remote sensing, oil leak detection, machine learning, aerial photos, unmanned aerial systems

Procedia PDF Downloads 37