Search results for: current spectral analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 33448

29158 An Integrated HCV Testing Model as a Method to Improve Identification and Linkage to Care in a Network of Community Health Centers in Philadelphia, PA

Authors: Catelyn Coyle, Helena Kwakwa

Abstract:

Objective: As novel and better-tolerated therapies become available, effective HCV testing and care models become increasingly necessary to not only identify individuals with active infection but also link them to HCV providers for medical evaluation and treatment. Our aim is to describe an effective HCV testing and linkage-to-care model piloted in a network of five community health centers located in Philadelphia, PA. Methods: In October 2012, the National Nursing Centers Consortium piloted a routine opt-out HCV testing model in a network of community health centers, one of which treats HCV, HIV, and co-infected patients. Key aspects of the model were medical assistant-initiated testing, the use of laboratory-based reflex test technology, and electronic medical record modifications to prompt, track, report and facilitate payment of test costs. Universal testing of all adult patients was implemented at health centers serving patients at high risk for HCV. The other sites integrated risk-based testing, in which patients who met one or more of the CDC testing recommendation risk factors or had a history of homelessness were eligible for HCV testing. Mid-course adjustments included the integration of dual HIV testing, the development of a linkage-to-care coordinator position to facilitate the transition of HIV- and/or HCV-positive patients from primary to specialist care, and the transition to universal HCV testing across all testing sites. Results: From October 2012 to June 2015, the health centers performed 7,730 HCV tests and identified 886 (11.5%) patients with a positive HCV-antibody test. Of those with positive HCV-antibody tests, 838 (94.6%) had an HCV-RNA confirmatory test and 590 (70.4%) had current HCV infection (overall prevalence=7.6%); 524 (88.8%) received their RNA-positive test result; 429 (72.7%) were referred to an HCV care specialist and 271 (45.9%) were seen by the HCV care specialist.
The best linkage-to-care results were seen at the test-and-treat site, where, of the 333 patients with current HCV infection, 175 (52.6%) were seen by an HCV care specialist. Of the patients with active HCV infection, 349 (59.2%) were unaware of their HCV-positive status at the time of diagnosis. Since the integration of dual HCV/HIV testing in September 2013, 9,506 HIV tests were performed; 85 (0.9%) patients had positive HIV tests, 81 (95.3%) received their confirmed HIV test result and 77 (90.6%) were linked to HIV care. Dual HCV/HIV testing increased the number of HCV tests performed by 362 between the 9 months preceding dual testing and the first 9 months after its integration, a 23.7% increase. Conclusion: Our HCV testing model shows that integrated routine testing and linkage to care is feasible and improved detection and linkage to care in a primary care setting. We found that the prevalence of current HCV infection was higher than that seen locally in Philadelphia and nationwide. Intensive linkage services can increase the number of patients who successfully navigate the HCV treatment cascade. The linkage-to-care coordinator acts as a trusted intermediary for patients being linked to care.
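The reported testing-cascade percentages can be reproduced directly from the counts given in the abstract; a minimal sketch:

```python
# HCV testing cascade, counts taken from the abstract (Oct 2012 - Jun 2015).
tested = 7730
ab_positive = 886        # positive HCV-antibody test
rna_tested = 838         # received HCV-RNA confirmatory test
rna_positive = 590       # current HCV infection
seen_by_specialist = 271

def pct(part, whole):
    """Percentage rounded to one decimal, as reported in the abstract."""
    return round(100 * part / whole, 1)

print(pct(ab_positive, tested))               # 11.5% antibody-positive
print(pct(rna_tested, ab_positive))           # 94.6% confirmed by RNA test
print(pct(rna_positive, rna_tested))          # 70.4% with current infection
print(pct(rna_positive, tested))              # 7.6% overall prevalence
print(pct(seen_by_specialist, rna_positive))  # 45.9% seen by a specialist
```

Note that the denominator shifts at each cascade step (all tested, then antibody-positive, then RNA-tested, then RNA-positive), which is why the percentages in the abstract do not share a common base.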

Keywords: HCV, routine testing, linkage to care, community health centers

Procedia PDF Downloads 344
29157 The Moderation Effect of Critical Item on the Strategic Purchasing: Quality Performance Relationship

Authors: Kwong Yeung

Abstract:

Theories about strategic purchasing and quality performance are underdeveloped. Understanding the evolving role of purchasing from reactive to proactive is a pressing strategic issue. Using survey responses from 176 manufacturing and electronics industry professionals, we study the relationships between strategic purchasing and supply chain partners’ quality performance to answer the following questions: Can transaction cost economics be used to elucidate the strategic purchasing-quality performance relationship? Is this strategic purchasing-quality performance relationship moderated by critical item analysis? The findings indicate that critical item analysis positively and significantly moderates the strategic purchasing-quality performance relationship.

Keywords: critical item analysis, moderation, quality performance, strategic purchasing, transaction cost economics

Procedia PDF Downloads 551
29156 Coordinated Voltage Control in a Radial Distribution System

Authors: Shivarudraswamy, Anubhav Shrivastava, Lakshya Bhat

Abstract:

Distributed generation has become a major area of interest in recent years. Distributed generation can serve a large number of loads on a power line and hence offers better efficiency than conventional methods. However, there are certain drawbacks associated with it, voltage rise being the major one. This paper addresses voltage control at the buses of an IEEE 30-bus system by regulating reactive power. For the analysis, suitable locations for placing distributed generators (DGs) are identified through load flow analysis, by observing where the voltage profile dips. MATLAB programming is used to regulate the voltage at all buses to within +/-5% of the base value even after the introduction of DGs. Three methods for voltage regulation are discussed, and a sensitivity-based analysis is then carried out to determine the priority among the methods listed in the paper.
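The +/-5% criterion used in the paper amounts to a simple band check on per-unit bus voltages; a sketch with illustrative values, not the paper's IEEE 30-bus results:

```python
# Flag buses whose voltage leaves the +/-5% band around the 1.0 pu base value
# (bus numbers and voltages below are illustrative only).
V_BASE, TOL = 1.0, 0.05

bus_voltages = {1: 1.00, 7: 1.03, 14: 1.06, 21: 0.94, 30: 0.97}  # per unit

out_of_band = {bus: v for bus, v in bus_voltages.items()
               if abs(v - V_BASE) > TOL}
print(out_of_band)   # {14: 1.06, 21: 0.94}
```

Buses flagged this way are the ones whose reactive power would be adjusted by the regulation methods the paper compares.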

Keywords: distributed generators, distributed system, reactive power, voltage control

Procedia PDF Downloads 484
29155 Effects of Educational Technology Integration in Classroom Instruction to the Math Performance of Generation Z Students of a Private High School in the Philippines

Authors: May Maricel De Gracia

Abstract:

Different generations respond differently to instruction because of their diverse characteristics, learning styles and study habits. Teaching strategies that were effective many years ago may not be effective now, especially for the current generation, Gen Z. Using a quantitative research design, the main goal of this paper is to determine the impact of educational technology integration at a private high school on the math performance of its Junior High School (JHS) students in SY 2014-2018, based on their periodical exam performance and their final math grades. In support, a survey on the use of technology was administered to determine the characteristics of both students and teachers in SY 2017-2018. Another survey, on study habits, was administered to the students to determine their readiness with regard to note-taking skills, time management, test-taking/preparation skills, reading, writing and math skills. Teaching strategies were recommended based on the needs of the current Gen Z JHS students. A total of 712 JHS students and 12 math teachers participated in answering the surveys. Periodic exam means and final math grades between the school years without technology (SY 2004-2008) and with technology (SY 2014-2018) were analyzed through correlation and regression analyses. The results show that the periodic exam mean has a 35.29% impact on the final grade of the students. In addition, the z-test result, where p > 0.05, shows that the periodical exam results do not differ significantly between the school years without and with the integration of technology. However, with p < 0.01, a significant positive difference was observed in the final math grades of students between the school years without and with technology integration.
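The z-test comparison of periodic exam means can be sketched as follows; the summary statistics below are hypothetical stand-ins, not the study's data:

```python
# Two-sample z-test sketch for comparing periodic-exam means between cohorts
# (means, SDs and cohort sizes below are hypothetical).
from math import sqrt
from statistics import NormalDist

m1, s1, n1 = 82.0, 8.0, 700   # exam mean: school years without technology
m2, s2, n2 = 82.6, 8.0, 700   # exam mean: school years with technology

z = (m2 - m1) / sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
p = 2 * (1 - NormalDist().cdf(abs(z)))   # two-tailed p-value

# With p > 0.05, the difference in periodic exam means is not statistically
# significant, mirroring the pattern the abstract reports for periodical exams.
print(round(z, 2), round(p, 3))
```

The same machinery, with the final-grade summary statistics, would yield the p < 0.01 result the abstract reports for final math grades.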

Keywords: classroom instruction, technology, generation z, math performance

Procedia PDF Downloads 136
29154 An Analysis of Discourse Markers Awareness in Writing Undergraduate Thesis of English Education Student in Sebelas Maret University

Authors: Oktanika Wahyu Nurjanah, Anggun Fitriana Dewi

Abstract:

An undergraduate thesis is a piece of academic writing that should fulfill several characteristics, one of which is coherence. The coherence of a text depends in part on the use of discourse markers; in other words, discourse markers play an essential role in writing. Therefore, the researchers aim to assess the awareness of discourse marker usage in the writing of an undergraduate thesis by an English Education student at Sebelas Maret University. This research uses a qualitative case study in order to obtain a deep analysis. The sample of this research is an undergraduate thesis of an English Education student at Sebelas Maret University, chosen based on several criteria. Guided by the literature, the researchers grouped the discourse markers by their functions, and the analysis was carried out on that basis. The analysis found that awareness of discourse marker usage is moderate. Finally, the researchers suggest that undergraduate students familiarize themselves with discourse markers, especially those who intend to write a thesis.

Keywords: discourse markers, English education, thesis writing, undergraduate student

Procedia PDF Downloads 342
29153 Modeling and Design of E-mode GaN High Electron Mobility Transistors

Authors: Samson Mil'shtein, Dhawal Asthana, Benjamin Sullivan

Abstract:

The wide energy gap of GaN is the major parameter justifying the design and fabrication of high-power electronic components made of this material. However, the piezoelectric sheet charge that exists naturally at the AlGaN/GaN interface complicates the control of carrier injection into the intrinsic channel of GaN HEMTs (High Electron Mobility Transistors). As a result, most of the transistors created as R&D prototypes, and all of the designs used for mass production, are D-mode devices, which introduces challenges in the design of integrated circuits. This research presents the design and modeling of an E-mode GaN HEMT with a very low turn-on voltage. The proposed device includes two critical elements allowing the transistor to achieve zero conductance across the channel at Vg = 0 V. The first is an extremely thin, 2.5 nm intrinsic Ga₀.₇₄Al₀.₂₆N spacer layer. The added spacer layer does not create piezoelectric strain but rather elastically follows the variations of the crystal structure of the adjacent GaN channel. The second important factor is the design of a gate metal with a high work function. The use of a gate metal with a work function greater than 5.3 eV (Ni in this research) positioned on top of n-type doped (Nd = 10¹⁷ cm⁻³) Ga₀.₇₄Al₀.₂₆N creates the necessary built-in potential, which controls the injection of electrons into the intrinsic channel as the gate voltage is increased. The 5 µm long transistor, with a 0.18 µm long gate and a channel width of 30 µm, operates at Vd = 10 V. At Vg = 1 V, the device reaches a maximum drain current of 0.6 mA, which indicates a high current density. The presented device is operational at frequencies greater than 10 GHz and exhibits a stable transconductance over the full range of operational gate voltages.
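As a quick sanity check, the quoted maximum drain current can be normalized by the 30 µm channel width to obtain the per-millimetre figure customary for HEMT comparisons:

```python
# Normalize the quoted maximum drain current by the channel width; values
# are taken from the abstract (0.6 mA at Vg = 1 V, 30 um wide channel).
i_d_max = 0.6e-3      # A, maximum drain current at Vg = 1 V
width = 30e-6         # m, channel width

i_per_mm = i_d_max / (width * 1e3)   # A per mm of gate width
print(round(i_per_mm, 3))            # 0.02 A/mm, i.e. 20 mA/mm
```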

Keywords: compound semiconductors, device modeling, enhancement mode HEMT, gallium nitride

Procedia PDF Downloads 248
29152 INCIPIT-CRIS: A Research Information System Combining Linked Data Ontologies and Persistent Identifiers

Authors: David Nogueiras Blanco, Amir Alwash, Arnaud Gaudinat, René Schneider

Abstract:

At a time when access to and sharing of information are crucial in the world of research, technologies such as persistent identifiers (PIDs), Current Research Information Systems (CRIS), and ontologies may create platforms for information sharing if they respond to the need to disambiguate their data by assuring interoperability within and between systems. INCIPIT-CRIS is a continuation of the former INCIPIT project, whose goal was to set up an infrastructure for low-cost attribution of PIDs with high granularity based on Archival Resource Keys (ARKs). INCIPIT-CRIS can be seen as its logical consequence: a research information management system developed from scratch. The system has been created on and around the Schema.org ontology, with further articulation of the use of ARKs, and is built upon the infrastructure previously implemented (i.e., INCIPIT) in order to enhance the persistence of URIs. INCIPIT-CRIS thus aims to be the hinge between the previously separate aspects of CRIS, ontologies and PIDs, producing a system that resolves disambiguation problems by combining an ontology such as Schema.org with unique persistent identifiers such as ARKs, enabling the sharing of information through a dedicated platform as well as the interoperability of the system by representing the entirety of the data as RDF triples. This paper presents the implemented solution and its use in a real-world setting. We describe the underlying ideas and inspirations while walking through the logic, the different functionalities implemented, and their links with ARKs and Schema.org. Finally, we discuss the tests performed with our project partner, the Swiss Institute of Bioinformatics (SIB), using large, real-world data sets.
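The core idea, Schema.org properties attached to ARK-identified resources and expressed as RDF triples, can be sketched in a few lines. The ARK namespace and entity names below are made-up placeholders, and a production system would use an RDF library such as rdflib rather than hand-built N-Triples:

```python
# Sketch: identify entities with ARK-based URIs, describe them with
# Schema.org properties, and serialize as N-Triples. The NAAN 99999 and
# the fk4... names are hypothetical examples.
SCHEMA = "https://schema.org/"
ARK = "https://n2t.net/ark:/99999/"

triples = [
    (ARK + "fk4p001", SCHEMA + "name", '"Example Researcher"'),
    (ARK + "fk4p001", SCHEMA + "affiliation", ARK + "fk4org1"),
    (ARK + "fk4d001", SCHEMA + "author", ARK + "fk4p001"),
]

def to_ntriples(s, p, o):
    """Serialize one triple; literals arrive pre-quoted, URIs get <>."""
    obj = o if o.startswith('"') else f"<{o}>"
    return f"<{s}> <{p}> {obj} ."

doc = "\n".join(to_ntriples(*t) for t in triples)
print(doc)
```

Because every subject and object is a resolvable ARK URI, the same triples can be shared on the platform and consumed by any RDF-aware system, which is the interoperability argument the abstract makes.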

Keywords: current research information systems, linked data, ontologies, persistent identifier, schema.org, semantic web

Procedia PDF Downloads 118
29151 Phenomena-Based Approach for Automated Generation of Process Options and Process Models

Authors: Parminder Kaur Heer, Alexei Lapkin

Abstract:

Due to the global challenges of increased competition and demand for more sustainable products and processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may attain higher efficiency. However, very few PI options are generally considered, because processes are typically analysed at the unit operation level, limiting the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. PI can be pursued at the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all intensification options can be described by their enhancement. The objective of the current work is thus the generation of numerous phenomena-based process alternatives and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is disintegrated into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. For example, separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena that can overcome the difficulties or drawbacks of the current process, or enhance its effectiveness, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense, and hence screening is carried out to discard the meaningless ones.
For example, phase change phenomena require the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e. it might perform reaction alone or reaction with separation. The combinations are then allotted to the functions needed for the process, creating a series of options for carrying out each function. Combining these options for the different functions in the process leads to a superstructure of process options. These process options, each defined by a list of phenomena per function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to obtain the process model. The most promising process options are then chosen subject to a performance criterion, for example product purity, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process and the best route was determined based on the higher product yield. The methodology can identify, produce and evaluate process intensification options from which the optimal process can be determined, and it can be applied to any chemical or biochemical process because of its generic nature.
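The generate-and-screen step, followed by the binary encoding handed to the model generator, can be sketched with a toy phenomena list; the names and the single screening rule below are illustrative, not the paper's full set:

```python
# Enumerate phenomena combinations, screen out infeasible ones, and encode
# a surviving option as binaries for model generation (toy example).
from itertools import combinations

phenomena = ["mixing", "reaction", "phase_change", "energy_transfer"]

candidates = [set(c) for r in range(1, len(phenomena) + 1)
              for c in combinations(phenomena, r)]

def feasible(combo):
    # Screening rule from the text: phase-change phenomena require the
    # co-presence of an energy-transfer phenomenon.
    return "phase_change" not in combo or "energy_transfer" in combo

options = [c for c in candidates if feasible(c)]

def encode(combo):
    """Binary (1, 0) encoding over the ordered phenomena list."""
    return [int(p in combo) for p in phenomena]

print(len(candidates), len(options))             # 15 candidates, 11 feasible
print(encode({"reaction", "energy_transfer"}))   # [0, 1, 0, 1]
```

With a realistic phenomena library the candidate set grows combinatorially, which is exactly why automated screening and the superstructure formulation are needed.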

Keywords: phenomena, process intensification, process models, process options

Procedia PDF Downloads 221
29150 Identifying E-Learning Components at North-West University, Mafikeng Campus

Authors: Sylvia Tumelo Nthutang, Nehemiah Mavetera

Abstract:

Educational institutions are under pressure from their competitors. Regulators and community groups need educational institutions to adopt appropriate business and organizational practices. Globally, educational institutions are now using e-learning as the best teaching and learning approach. E-learning is becoming the center of attention for learning institutions, educational systems and software developers. North-West University (NWU) currently uses eFundi, a Learning Management System (LMS). An LMS comprises the information systems and procedures that add value to students' learning and support the learning material in text or multimedia files. With various e-learning tools, students are able to access all the materials related to a course as electronic copies. The study was tasked with identifying the e-learning components at the NWU Mafikeng campus. A quantitative research methodology was used for data collection and descriptive statistics for data analysis. Activity Theory (AT) was used to guide the study. AT outlines the tensions between e-learning at the macro-organizational level (plans, guiding principles, campus-wide solutions) and the micro-organizational level (daily practice, collaborative transformation, specific adaptation). In a technological environment, AT shifts attention from computers as the sole area of concern toward an understanding of technology as part of human activities. The findings identified the university's current IT tools and its knowledge of e-learning elements. It was recommended that the university consider buying computer resources that consume less power and practice e-learning effectively.

Keywords: e-learning, information and communication technology (ICT), teaching, virtual learning environment

Procedia PDF Downloads 265
29149 Longitudinal Analysis of Internet Speed Data in the Gulf Cooperation Council Region

Authors: Musab Isah

Abstract:

This paper presents a longitudinal analysis of Internet speed data in the Gulf Cooperation Council (GCC) region, focusing on the most populous cities of each of the six countries – Riyadh, Saudi Arabia; Dubai, UAE; Kuwait City, Kuwait; Doha, Qatar; Manama, Bahrain; and Muscat, Oman. The study utilizes data collected from the Measurement Lab (M-Lab) infrastructure over a five-year period from January 1, 2019, to December 31, 2023. The analysis includes downstream and upstream throughput data for the cities, covering significant events such as the launch of 5G networks in 2019, COVID-19-induced lockdowns in 2020 and 2021, and the subsequent recovery period and return to normalcy. The results showcase substantial increases in Internet speeds across the cities, highlighting improvements in both download and upload throughput over the years. All the GCC countries have achieved above-average Internet speeds that can conveniently support various online activities and applications with excellent user experience.
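A longitudinal summary of this kind reduces to grouping measurement records by city and year and taking a robust statistic such as the median; the field names and values below are assumed placeholders, not actual M-Lab NDT data:

```python
# Sketch: median download throughput per (city, year) from NDT-style
# records. Field names and the sample values are illustrative.
from collections import defaultdict
from statistics import median

records = [
    {"city": "Riyadh", "year": 2019, "download_mbps": 21.0},
    {"city": "Riyadh", "year": 2019, "download_mbps": 25.0},
    {"city": "Riyadh", "year": 2023, "download_mbps": 88.0},
    {"city": "Dubai",  "year": 2023, "download_mbps": 104.0},
    {"city": "Dubai",  "year": 2023, "download_mbps": 96.0},
]

by_city_year = defaultdict(list)
for r in records:
    by_city_year[(r["city"], r["year"])].append(r["download_mbps"])

medians = {k: median(v) for k, v in by_city_year.items()}
print(medians[("Riyadh", 2019)])   # 23.0
print(medians[("Dubai", 2023)])    # 100.0
```

Medians are preferred over means for throughput data because individual NDT measurements are heavily right-skewed by short-lived congestion events.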

Keywords: internet data science, internet performance measurement, throughput analysis, internet speed, measurement lab, network diagnostic tool

Procedia PDF Downloads 38
29148 Audit of Intraoperative Ventilation Strategy in Prolonged Abdominal Surgery

Authors: Prabir Patel, Eugene Ming Han Lim

Abstract:

Introduction: Current literature shows that postoperative pulmonary complications following abdominal surgery may be reduced by using lower-than-conventional tidal volumes intraoperatively together with moderate levels of positive end-expiratory pressure (PEEP). Recent studies demonstrated a significant reduction in major complications in elective abdominal surgery through the use of lower tidal volumes (6-8 ml/kg predicted body weight), PEEP of 5 cmH2O and recruitment manoeuvres, compared with higher ‘conventional’ volumes (10-12 ml/kg PBW) without lung recruitment. Our objective was to retrospectively audit current practice for patients undergoing major abdominal surgery at Sir Charles Gairdner Hospital. Methods: Patients over 18 undergoing elective general surgery lasting more than 3 hours who were intubated for the duration of the procedure were included in this audit. Data were collected over a 6-month period. Patients who had hepatic surgery, procedures necessitating one-lung ventilation, transplant surgery, or a documented history of pulmonary or intracranial hypertension were excluded. Results: 58 suitable patients were identified and notes were available for 54. Key findings: average peak airway pressure was 21 cmH2O (±4); peak airway pressure was less than 30 cmH2O in all patients and less than 25 cmH2O in 80% of cases. PEEP was used in 81% of cases; where PEEP was used, it was 5 cmH2O or more in 75%. Average tidal volume per actual body weight was 7.1 ml/kg (±1.6). Average tidal volume per predicted body weight (PBW) was 8.8 ml/kg (±1.5); tidal volume was less than 10 ml/kg PBW in 90% of cases and 6-8 ml/kg PBW in 40%. There was no recorded use of recruitment manoeuvres in any case.
Conclusions: In the vast majority of patients undergoing prolonged abdominal surgery, a lung-protective strategy using moderate levels of PEEP, peak airway pressures of less than 30 cmH2O and tidal volumes of less than 10 ml/kg PBW was utilised. A recent randomised controlled trial demonstrated benefit from utilising even lower volumes (6-8 ml/kg), based on findings in critical care patients, but this was compared with volumes of 10-12 ml/kg. Volumes of 6-8 ml/kg PBW were utilised in 40% of cases in this audit. Although theoretically beneficial, the clinical benefit of volumes lower than those currently practised in this institution remains to be seen. The incidence of pulmonary complications was much lower than in the cited studies, and a larger data set would be required to investigate any benefit from lower-tidal-volume ventilation. The volumes used are comparable to results from published local and international data, but PEEP utilisation was higher in this audit. Strategies that may be implemented to ensure and maintain best practice include pre-operative recording of predicted body weight, adjustment of default ventilator settings, and education/updates on current evidence.
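The tidal-volume targets discussed above are anchored to predicted body weight; a sketch using the ARDSNet/Devine formula, which is standard practice rather than something taken from the audit itself:

```python
# Predicted body weight (ARDSNet / Devine formula) and the corresponding
# 6-8 ml/kg protective tidal-volume range.
def predicted_body_weight(height_cm, male):
    """PBW in kg: 50 (men) or 45.5 (women) + 0.91 * (height_cm - 152.4)."""
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)

pbw = predicted_body_weight(175, male=True)   # ~70.6 kg for a 175 cm man
vt_low, vt_high = 6 * pbw, 8 * pbw            # protective Vt range, ml
print(round(pbw, 1), round(vt_low), round(vt_high))
```

This is why the audit recommends pre-operative recording of predicted body weight: without it, tidal volumes tend to be set against actual body weight, which overestimates the target in heavier patients.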

Keywords: anaesthesia, intraoperative ventilation, PEEP, tidal volume

Procedia PDF Downloads 754
29147 Integrating Data Envelopment Analysis and Variance Inflation Factor to Measure the Efficiency of Decision Making Units

Authors: Mostafa Kazemi, Zahra N. Farkhani

Abstract:

This paper proposes an integrated Data Envelopment Analysis (DEA) and Variance Inflation Factor (VIF) model for measuring the technical efficiency of decision-making units. The model is validated using a set of 69 dairy-product sales representatives. The analysis is done in two stages: in the first stage, the VIF technique is used to distinguish the independent effective factors of resellers, and in the second stage DEA is used to measure efficiency under both constant and variable returns to scale. DEA is further used to examine the effect of environmental factors on efficiency. The results indicate an average managerial efficiency of 83% across the sales representatives. In addition, technical and scale efficiency were 96% and 80%, respectively. 38% of the sales representatives have a technical efficiency of 100%, and 72% are fully efficient in terms of managerial efficiency. High levels of relative efficiency indicate a good condition for sales representative efficiency.

Keywords: data envelopment analysis (DEA), relative efficiency, sales representatives’ dairy products, variance inflation factor (VIF)

Procedia PDF Downloads 547
29146 Agroecology: Rethink the Local in the Global to Promote the Creation of Novelties

Authors: Pauline Cuenin, Marcelo Leles Romarco Oliveira

Abstract:

Based on their localities and following their ecological rationality, family-based farmers have experimented, adapted and innovated to improve their production systems continuously for millennia. With the technological package transfer processes of the so-called Green Revolution, farmers became increasingly dependent on ready-made "recipes" built from so-called "universal", global knowledge to face the problems that emerge in the management of local agroecosystems, thus reducing their creative and experiential capacities. However, the production of novelties within farms is fundamental to the transition to more sustainable agri-food systems. Indeed, as the fruits of local knowledge and/or the contextualization of exogenous knowledge, novelties are seen as seeds of transition. With its new techniques, new organizational forms and epistemological approaches, agroecology has been pointed to as a way to encourage and promote the creative capacity of farmers. From this perspective, this theoretical work aims to analyze how agroecology encourages the innovative capacity of farmers and, more generally, the production of novelties. For this, the theoretical and methodological bases of agroecology were analyzed through a literature review, looking specifically at how it articulates the local with the global, complemented by an analysis of Brazilian agroecological experiences. It was emphasized that, based on the peasant way of doing agriculture, that is, on ecological/social co-evolution (also called co-production, the interaction between human beings and living nature), agroecology recognizes and revalues peasant knowledge, which involves the farmer's deep interactions with his site (biophysical and social).
As a "place science", practice and movement, it takes into consideration the local, empirical knowledge of farmers, which makes it possible to question and modify the paradigms that underpin the current agriculture that has disintegrated farmers' creative processes. Beyond revaluing the local, agroecology allows local knowledge to dialogue with global knowledge, which is essential in the process of change to move beyond the dominant logic of thought and give shape to new experiences. To achieve this articulation, agroecology involves new methodological approaches, seeking participatory methods of study and intervention that express themselves in the form of horizontal spaces of socialization and collective learning involving several actors with different kinds of knowledge. These processes promoted by agroecology favor the production of novelties at the local level for expansion to other levels, such as the global, through translocal agroecological networks.

Keywords: agroecology, creativity, global, local, novelty

Procedia PDF Downloads 204
29145 Trends of Conservation and Development in Mexican Biosphere Reserves: Spatial Analysis and Linear Mixed Model

Authors: Cecilia Sosa, Fernanda Figueroa, Leonardo Calzada

Abstract:

Biosphere reserves (BR) are considered the main strategy for biodiversity and ecosystem conservation. Mexican BR are mainly inhabited by rural communities who strongly depend on forests and their resources. Even though the dual objective of conservation and development has been sought in BR, land cover change is a common process in these areas, while most rural communities are highly marginalized, partly as a result of restrictions imposed by conservation on the access to and use of resources. Achieving ecosystem conservation and social development faces serious challenges. Factors such as financial support for development projects (public/private), environmental conditions, infrastructure and regional economic conditions might influence both land use change and wellbeing. Examining the temporal trends of conservation and development in BR is central to the evaluation of outcomes of these conservation strategies. In this study, we analyzed changes in primary vegetation cover (as a proxy for conservation) and the marginalization index (as a proxy for development) in Mexican BR (2000-2015); we also explore the influence of various factors affecting these trends, such as financial support for conservation-development projects (public or private), geographical distribution across ecoregions (as a proxy for shared environmental conditions) and across economic zones (as a proxy for regional economic conditions). We developed a spatial analysis at the municipal scale (2,458 municipalities nationwide) in ArcGIS to obtain road densities, geographical distribution across ecoregions and economic zones, the financial support received, and the percent of municipality area under protection by protected areas and, particularly, by BR. Those municipalities with less than 25% of area under protection were regarded as part of the protected area.
We obtained marginalization indexes for all municipalities and, using MODIS in Google Earth Engine, the number of pixels covered by primary vegetation. We used a linear mixed model in RStudio for the analysis. We found a positive correlation between the marginalization index and the percent of primary vegetation cover per year (r = 0.49-0.5); i.e., municipalities with higher marginalization also show a higher percent of primary vegetation cover. Also, municipalities with a larger area under protection have more development projects (r = 0.46), and some environmental conditions were relevant to the percent of vegetation cover. Time, economic zones and the marginalization index were all important; time was particularly so in 2005, when both marginalization and deforestation decreased. Road densities and financial support for conservation-development projects were irrelevant as factors in the general correlation. Marginalization is still affected by the conservation strategies applied in BR, even though this management category takes both conservation and the development of local communities as its objectives. Our results suggest that road densities and support for conservation-development projects have not been a factor in poverty alleviation. As better conservation is attained in the most impoverished areas, we face the dilemma of how to improve wellbeing in rural communities under conservation, since current strategies have not been able to move beyond the conservation-development contraposition.

Keywords: deforestation, local development, marginalization, protected areas

Procedia PDF Downloads 116
29144 Generative Adversarial Network Based Fingerprint Anti-Spoofing Limitations

Authors: Yehjune Heo

Abstract:

Fingerprint anti-spoofing approaches have been actively developed and applied in real-world applications. One of the main problems of fingerprint anti-spoofing is that models are not robust to unseen samples, especially in real-world scenarios. A possible solution is to generate artificial but realistic fingerprint samples and use them in training in order to achieve good generalization. This paper contains experimental and comparative results with currently popular GAN-based methods and uses realistic synthesis of fingerprints in training in order to increase performance. Among the various GAN models, StyleGAN, currently the most popular, was used for the experiments. The CNN models were first trained on a dataset that did not contain generated fake images, and the accuracy along with the mean average error rate were recorded. Then, the generated fake images (fake images of live fingerprints and fake images of spoof fingerprints) were each combined with the original images (real images of live fingerprints and real images of spoof fingerprints), and various CNN models were trained. For each CNN model trained on a dataset containing generated fake images, the best performance, accuracy, and mean average error rate were recorded. We observe that current GAN-based approaches need significant improvements in anti-spoofing performance, although the overall quality of the synthesized fingerprints seems reasonable. We include an analysis of this performance degradation, especially with a small number of samples. In addition, we suggest several approaches towards improved generalization with a small number of samples, by focusing on what GAN-based approaches should and should not learn.
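The training protocol described (a baseline set of real live and spoof images, then the same set augmented with GAN-generated fakes of each class) can be sketched as follows. The function names and the label convention (1 = live, 0 = spoof) are illustrative, not taken from the paper:

```python
import random

def build_training_sets(real_live, real_spoof, fake_live, fake_spoof):
    """Build (baseline, augmented) training sets of (sample, label) pairs.
    Fakes inherit the label of the class they imitate, as in the protocol
    described in the abstract."""
    baseline = [(x, 1) for x in real_live] + [(x, 0) for x in real_spoof]
    augmented = (baseline
                 + [(x, 1) for x in fake_live]
                 + [(x, 0) for x in fake_spoof])
    random.shuffle(augmented)
    return baseline, augmented

def accuracy(predictions, labels):
    """Fraction of correct predictions, one of the metrics recorded per model."""
    return sum(p == y for p, y in zip(predictions, labels)) / len(labels)

# illustrative placeholder samples, not real image data
baseline, augmented = build_training_sets(
    ["r1", "r2"], ["s1"], ["f1"], ["f2", "f3"])
```

Each CNN variant would then be trained once on `baseline` and once on `augmented`, and the two metric sets compared.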

Keywords: anti-spoofing, CNN, fingerprint recognition, GAN

Procedia PDF Downloads 175
29143 Modelling Conceptual Quantities Using Support Vector Machines

Authors: Ka C. Lam, Oluwafunmibi S. Idowu

Abstract:

Uncertainty in cost is a major factor affecting the performance of construction projects. To our knowledge, several conceptual cost models have been developed with varying degrees of accuracy. Incorporating conceptual quantities into conceptual cost models could improve the accuracy of early predesign cost estimates. Hence, the aim of the current research is to develop quantity models for estimating the conceptual quantities of framed reinforced concrete structures using supervised machine learning. Using measured quantities of structural elements and design variables such as live loads and soil bearing pressures, response and predictor variables were defined and used to construct conceptual quantity models. Twenty-four models were developed for comparison using a combination of non-parametric support vector regression, linear regression, and bootstrap resampling techniques. The R programming language was used for data analysis and model implementation. Gross soil bearing pressure and gross floor loading were found to have a major influence on the quantities of concrete and reinforcement used for foundations, while building footprint and gross floor loading had a similar influence on beams and slabs. Future research could explore the modelling of other conceptual quantities, such as walls, finishes, and services, using machine learning techniques. Estimation of conceptual quantities would assist construction planners in early resource planning and enable detailed performance evaluation of early cost predictions.
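Bootstrap resampling, one of the techniques named above, can be sketched in a few lines. The study used R; the Python sketch below shows a minimal percentile bootstrap for the mean of a set of model errors, and the numeric values are illustrative, not the study's data:

```python
import random
import statistics

def bootstrap_ci(values, n_resamples=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for the mean of `values`."""
    rng = random.Random(seed)
    means = sorted(
        statistics.fmean(rng.choice(values) for _ in values)
        for _ in range(n_resamples))
    lo = means[int((alpha / 2) * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# illustrative prediction errors for one candidate quantity model
errors = [10.0, 12.0, 11.0, 13.0, 9.0, 14.0, 10.0, 12.0]
lo, hi = bootstrap_ci(errors)
```

The same resampling loop can wrap any fitted model (SVR or linear regression) to compare the stability of the twenty-four candidates.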

Keywords: bootstrapping, conceptual quantities, modelling, reinforced concrete, support vector regression

Procedia PDF Downloads 199
29142 Effects of Small Amount of Poly(D-Lactic Acid) on the Properties of Poly(L-Lactic Acid)/Microcrystalline Cellulose/Poly(D-Lactic Acid) Blends

Authors: Md. Hafezur Rahaman, Md. Sagor Hosen, Md. Abdul Gafur, Rasel Habib

Abstract:

This research is a systematic study of the effects of poly(D-lactic acid) (PDLA) on the properties of poly(L-lactic acid) (PLLA)/microcrystalline cellulose (MCC)/PDLA blends through stereocomplex crystallization. Blends were prepared with a constant percentage (3%) of MCC and different percentages of PDLA by solution casting. The blends were characterized by Fourier transform infrared spectroscopy (FTIR) to confirm blend compatibility, wide-angle X-ray scattering (WAXS) and scanning electron microscopy (SEM) for morphology analysis, and thermogravimetric analysis (TGA) and differential thermal analysis (DTA) for thermal property measurement. FTIR analysis confirmed that no new characteristic absorption peaks appeared in the spectrum; instead, peak shifts attributable to hydrogen bonding indicate compatibility of the blend components. The appearance of three new peaks in the XRD analysis strongly indicates the formation of stereocomplex crystallinity in the PLLA structure upon the addition of PDLA. TGA and DTG results indicate that PDLA can improve the heat resistance of the PLLA/MCC blends by increasing their degradation temperature, and comparison of the DTA peaks also confirms the improved thermal properties. SEM images show improved surface morphology.

Keywords: microcrystalline cellulose, poly(l-lactic acid), stereocomplex crystallization, thermal stability

Procedia PDF Downloads 122
29141 Identify the Renewable Energy Potential through Sustainability Indicators and Multicriteria Analysis

Authors: Camila Lima, Murilo Andrade Valle, Patrícia Teixeira Leite Asano

Abstract:

The growth in demand for electricity caused by human development, together with the depletion of resources and the environmental impacts caused by traditional sources of electricity generation, has made new energy sources increasingly necessary and encouraged for companies in the electricity sector. Based on this scenario, this paper assesses the negative environmental impacts associated with thermoelectric power plants in Brazil, pointing out the importance of using renewable energy sources to reduce environmental harm. The article identifies wind energy as an alternative for the municipalities of São Paulo, represented in georeferenced maps built with GIS, using sustainability indicators and multicriteria analysis as the basis of the decision-making process.

Keywords: GIS (geographic information systems), multicriteria analysis, sustainability, wind energy

Procedia PDF Downloads 351
29140 The Non-Linear Analysis of Brain Response to Visual Stimuli

Authors: H. Namazi, H. T. N. Kuan

Abstract:

Brain activity can be measured by acquiring and analyzing EEG signals from an individual; in fact, the human brain's response to external and internal stimuli is mapped in its EEG signals. Over the years, methods such as the Fourier transform, the wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external ones. Each of these methods, however, has weak points in the analysis of EEG signals. For instance, the Fourier and wavelet transforms are linear signal analysis methods, which makes them poorly suited to EEG signals, which are nonlinear. In this research, we analyze the brain's response to visual stimuli by extracting various measures from EEG signals using software developed by our research group. The measures used are Jeffrey's measure, fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain's response to visual stimuli but also provide good recommendations for clinical purposes.
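Of the measures named, the Hurst exponent is the most straightforward to sketch. The authors' software is not described, so the following is an independent, minimal rescaled-range (R/S) estimator; the demo series (white noise versus its running sum) are synthetic, not EEG data:

```python
import itertools
import math
import random
import statistics

def hurst_rs(series, window_sizes=(8, 16, 32, 64)):
    """Estimate the Hurst exponent as the log-log slope of the
    rescaled range R/S against window size n."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            m = statistics.fmean(w)
            cum = list(itertools.accumulate(x - m for x in w))
            r = max(cum) - min(cum)          # range of cumulative deviations
            s = statistics.pstdev(w)          # window standard deviation
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_n.append(math.log(n))
            log_rs.append(math.log(statistics.fmean(rs_vals)))
    # least-squares slope of log(R/S) vs log(n)
    mx, my = statistics.fmean(log_n), statistics.fmean(log_rs)
    num = sum((x - mx) * (y - my) for x, y in zip(log_n, log_rs))
    den = sum((x - mx) ** 2 for x in log_n)
    return num / den

rng = random.Random(7)
noise = [rng.gauss(0.0, 1.0) for _ in range(512)]   # H near 0.5
walk = list(itertools.accumulate(noise))            # persistent, H near 1
h_noise = hurst_rs(noise)
h_walk = hurst_rs(walk)
```

Applied to a 60-second EEG window, a higher H indicates a more persistent (less noise-like) signal.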

Keywords: visual stimuli, brain response, EEG signal, fractal dimension, Hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 541
29139 Screening for Antibacterial Activity of Fungi from Indian Marine Environments: A Possible Alternative for New Antibiotics for the Treatment of Skin Microbial Infections

Authors: Shivankar Agrawal, Sunil Kumar Deshmukh, Colin Barrow, Alok Adholeya

Abstract:

Millions of people worldwide are affected by infectious diseases caused by bacteria and fungi, and skin and skin structure infections (SSSI) represent a significant category of infectious disease. Many pathogens have developed resistance to current antibiotics, and over time this problem has become increasingly serious. These problems necessitate a continuous search for novel and alternative antibiotics and antifungals. The aim of our research is to screen extracts of marine fungi for antibacterial activity against bacteria causing skin and wound infections in humans. A total of 40 marine samples were collected from the west coast and the Andaman Islands of India, and 35 morphologically different marine fungi were isolated using natural seawater medium. Among the 35 marine fungi, eight isolates exhibited significant antimicrobial activity against human pathogens. In the course of this systematic screening program for bioactive marine fungi, strain 'D5', morphologically identified as Simplicillium lamellicola, was found to be the most potent, with an MIC value of 1 mg/mL. The effects of the most active crude extracts on their susceptible test microorganisms were also investigated by SEM analysis. Purification and characterization of the crude extracts to identify the active lead molecules are in progress. The results on diversity and antimicrobial activity increase the scope for finding industrially important marine fungi in Indian marine environments, and these organisms could be vital sources for the discovery of pharmaceutically useful molecules.

Keywords: antimicrobial activity, antibiotic, marine fungi, skin infections

Procedia PDF Downloads 253
29138 Static and Dynamical Analysis on Clutch Discs on Different Material and Geometries

Authors: Jairo Aparecido Martins, Estaner Claro Romão

Abstract:

This paper presents the static and cyclic stresses, in combination with fatigue analysis, resulting from loads applied to the friction discs commonly used in industrial clutches. The material chosen to simulate the friction discs under load is aluminum. The numerical simulation was done with COMSOL™ Multiphysics software. The results obtained for static loads showed sufficient stiffness for both geometries and the material used. From a fatigue standpoint, on the other hand, failure is clearly verified, which demonstrates the importance of both approaches, especially the dynamic analysis. The results and the conclusions are based on the stresses on the disc, the counted stress cycles, and the fatigue usage factor.

Keywords: aluminum, industrial clutch, static and dynamic loading, numerical simulation

Procedia PDF Downloads 175
29137 Antioxidant Potency of Ethanolic Extracts from Selected Aromatic Plants by in vitro Spectrophotometric Analysis

Authors: Tatjana Kadifkova Panovska, Svetlana Kulevanova, Blagica Jovanova

Abstract:

Biological systems possess the ability to neutralize an excess of reactive oxygen species (ROS) and to protect cells from destructive alterations. However, many pathological conditions (cardiovascular diseases, autoimmune disorders, cancer) are associated with inflammatory processes that generate an excessive amount of ROS, shifting the balance between endogenous antioxidant systems and free oxygen radicals in favor of the latter and leading to oxidative stress. An additional source of natural compounds with antioxidant properties that can reduce the amount of ROS in cells is therefore much needed. Despite their broad utilization, many plant species remain largely unexplored. The purpose of the present study is thus to investigate the antioxidant activity of twenty-five selected medicinal and aromatic plant species. The antioxidant activity of the ethanol extracts was evaluated with in vitro assays: 2,2'-diphenyl-1-picryl-hydrazyl (DPPH), ferric reducing antioxidant power (FRAP), and non-site-specific (NSSOH) and site-specific (SSOH) hydroxyl radical 2-deoxy-D-ribose degradation assays. The Folin-Ciocalteu method and the AlCl3 method were used to determine total phenolic content (TPC) and total flavonoid content (TFC). All examined plant extracts showed antioxidant activity to a different extent. Cinnamomum verum J.Presl bark and Ocimum basilicum L. herba demonstrated strong radical scavenging activity and reducing power in the DPPH and FRAP assays, respectively. Significant hydroxyl scavenging potential and metal chelating properties were also observed using the NSSOH and SSOH assays. Furthermore, significant variations were found in TPC and TFC, with Cinnamomum verum and Ocimum basilicum showing the highest amounts of total polyphenols.
The considerably strong radical scavenging activity, hydroxyl scavenging potential, and reducing power of the species mentioned above suggest the presence of highly bioactive phytochemical compounds, predominantly polyphenols. Since flavonoids are the most abundant group of polyphenols and possess a large number of reactive OH groups in their structure, they are considered the main contributors to the radical scavenging properties of the examined plant extracts. This observation is supported by the positive correlation between radical scavenging activity and total polyphenolic and flavonoid content obtained in the current research. These findings nominate Cinnamomum verum bark and Ocimum basilicum herba as potential sources of bioactive compounds that could be used as antioxidant additives in the food and pharmaceutical industries. Moreover, the present study provides baseline data for future research into the untapped potential of these important, so far little-explored plants.
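DPPH radical scavenging activity is conventionally reported as percent inhibition, i.e. the drop in absorbance (typically read at 517 nm) relative to the control. The study does not report its raw readings, so the absorbance values below are illustrative only:

```python
def radical_scavenging(abs_control, abs_sample):
    """Percent inhibition of the DPPH radical: the standard
    (A_control - A_sample) / A_control * 100 formula."""
    return (abs_control - abs_sample) / abs_control * 100.0

# illustrative absorbance readings, not values from the study
inhibition = radical_scavenging(0.80, 0.20)
```

Running this over a dilution series of each extract gives the dose-response curve from which scavenging potency is compared.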

Keywords: ethanol extracts, radical scavenging activity, reducing power, total polyphenols

Procedia PDF Downloads 187
29136 Patient-Reported Adverse Reactions to Adolescent Non-Suicidal Self-Injury Disclosures and Implications for Clinical Practice

Authors: Renee Fabian, Jordan Davidson

Abstract:

Current research on non-suicidal self-injury (NSSI) provides ample insights into best practices for caregivers and clinicians addressing and reducing NSSI behavior among adolescents. However, the efficacy of evidence-based NSSI interventions and their delivery from the perspective of adolescent patients has not received significant attention, creating a gap between the efficacy of research-based NSSI interventions on the one hand and adolescent perceptions of NSSI treatment and willingness to engage in NSSI interventions on the other. To address this gap and inform more effective treatment outcomes, the current study aims to identify major patient-reported adverse reactions to NSSI disclosures from caregivers, treating mental health clinicians, and medical professionals, using a mixed-methods survey of 2,500 people with a history of NSSI conducted by editors at a consumer-facing health publication. Based on the analyzed results, a majority of adolescents with a history of NSSI found parents and caregivers ineffective at empathetically addressing NSSI, and a significant number of participants reported that at least one treating mental health professional responded inadequately to NSSI behaviors, in addition to other adverse reactions to NSSI disclosures that serve as barriers to treatment. NSSI is a significant risk factor for future suicide attempts. Addressing patient-reported adverse reactions to NSSI disclosures in the adolescent population can remove barriers to the effectiveness of caregiver and clinician NSSI interventions, reduce the risk of NSSI-related harm, and lower the risk of future suicide attempts or completions.

Keywords: adolescent self-injury, non-suicidal self-injury, patient perspectives, self-harm interventions

Procedia PDF Downloads 97
29135 Hydrology and Hydraulics Analysis of Beko Abo Dam and Appurtenant Structure Design, Ethiopia

Authors: Azazhu Wassie

Abstract:

This study evaluates the maximum design flood for appurtenant structure design using the available climatological and hydrological data for the study area. The maximum design flood is determined by flood frequency analysis. Using this method, the peak discharge at the gauged station is 32,583.67 m3/s; because the dam site is not at the gauged station, the flow is transferred to the dam site, giving a peak discharge of 38,115 m3/s. The study was conducted in June 2023. The dam is built across a river to create a reservoir on its upstream side for impounding water. The water stored in the reservoir is used for various purposes, such as irrigation, hydropower, navigation, and fishing. The total average volume of annual runoff is estimated at 115.1 billion m3. The total potential of the land for irrigation development can exceed 3 million ha.
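The abstract does not state which distribution was fitted in the flood frequency analysis. As one common choice for annual peak flows, Gumbel's extreme-value method can be sketched as follows; the annual peaks below are illustrative, not the study's record:

```python
import math
import statistics

def gumbel_design_flood(annual_peaks, return_period):
    """Design flood by Gumbel's method: Q_T = mean + K_T * std,
    with K_T derived from the reduced variate (large-sample form)."""
    mean = statistics.fmean(annual_peaks)
    std = statistics.stdev(annual_peaks)
    y_t = -math.log(-math.log(1.0 - 1.0 / return_period))  # reduced variate
    k_t = (y_t - 0.5772) * math.sqrt(6.0) / math.pi        # frequency factor
    return mean + k_t * std

# illustrative annual peak flows (m3/s)
peaks = [1200.0, 950.0, 1800.0, 1400.0, 1100.0,
         2100.0, 1650.0, 1300.0, 990.0, 1750.0]
q50 = gumbel_design_flood(peaks, 50)    # 50-year design flood
q100 = gumbel_design_flood(peaks, 100)  # 100-year design flood
```

Longer return periods yield larger design floods, which is the basis for sizing spillways and other appurtenant structures.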

Keywords: dam design, flow duration curve, peak flood, rainfall, reservoir capacity, risk and reliability

Procedia PDF Downloads 8
29134 Study of Mobile Game Addiction Using Electroencephalography Data Analysis

Authors: Arsalan Ansari, Muhammad Dawood Idrees, Maria Hafeez

Abstract:

Use of mobile phones has increased considerably over the past decade, and they are currently one of the main sources of communication and information. Initially, mobile phones were limited to calls and messages, but with the advent of new technology, smartphones came to be used for many other purposes, including video games. Despite positive outcomes, addiction to video games on mobile phones has become a leading cause of psychological and physiological problems for many people. Several researchers have examined different aspects of behavioral addiction with the use of different scales. The objective of this study is to examine, using electroencephalography (EEG), whether mobile-game-addicted and non-addicted players can be distinguished on the basis of psycho-physiological indicators. The players were asked to play a mobile game, and EEG signals were recorded by BIOPAC equipment with AcqKnowledge as the data acquisition software. Electrodes were placed following the 10-20 system. EEG was recorded at a sampling rate of 200 samples/s (12,000 samples/min). EEG recordings were obtained from the frontal (Fp1, Fp2), parietal (P3, P4), and occipital (O1, O2) lobes of the brain. The frontal lobe is associated with behavioral control, personality, and emotions; the parietal lobe is involved in perception, understanding logic, and arithmetic; and the occipital lobe plays a role in visual tasks. For this study, a 60-second time window was chosen for analysis. Preliminary analysis of the signals was carried out with the AcqKnowledge software of BIOPAC Systems. From a survey based on the CGS manual study (2010), it was concluded that five of the fifteen participants fell into the addicted category; this was used as prior information to group the addicted and non-addicted players in the physiological analysis. Statistical analysis showed that, by applying a clustering technique, the authors were able to separate addicted from non-addicted players, specifically in the theta frequency range over the occipital area.
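The band-limited feature the study clusters on (theta power over occipital electrodes) can be sketched by summing DFT power in the 4-8 Hz band. The naive DFT below is fine for short windows; the signal is a synthetic 6 Hz oscillation at the study's 200 samples/s rate, not recorded EEG:

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power in [f_lo, f_hi) Hz via a naive DFT (O(n^2), fine for short windows)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq < f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2
    return power

# 1 s of a synthetic 6 Hz "theta" oscillation sampled at 200 samples/s
fs = 200
sig = [math.sin(2 * math.pi * 6 * t / fs) for t in range(fs)]
theta = band_power(sig, fs, 4, 8)    # theta band, 4-8 Hz
alpha = band_power(sig, fs, 8, 13)   # alpha band, 8-13 Hz
```

Per-electrode band powers like these would form the feature vectors fed to the clustering step.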

Keywords: mobile game, addiction, psycho-physiology, EEG analysis

Procedia PDF Downloads 149
29133 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights

Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan

Abstract:

The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. Efficient NoSQL databases are needed to analyze huge datasets effectively. This research integrates several datasets to enable analysis of post-COVID-19 health and well-being outcomes and evaluation of the effectiveness of government efforts during the pandemic, cutting down query processing time and producing predictive visual artifacts. We recommend applying sharding and indexing to improve query effectiveness and scalability as the dataset expands: distributing the data across a sharded database and building indexes on individual shards enables effective data retrieval and analysis. The key goal is to analyze the connections between governmental activities, poverty levels, and post-pandemic well-being; we evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
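As a concept sketch only (MongoDB's actual sharding involves a balancer, chunk management, and config servers), hash-based routing by a shard key can be illustrated in plain Python. The `region` key and record fields are invented for the example:

```python
import hashlib

def shard_for(key, n_shards):
    """Stable hash-based routing from a shard-key value to a shard id,
    so equal keys always land on the same shard."""
    digest = hashlib.md5(str(key).encode("utf-8")).hexdigest()
    return int(digest, 16) % n_shards

# distribute records across 4 shards by a hypothetical 'region' shard key
shards = {i: [] for i in range(4)}
records = [{"region": r, "cases": c} for r, c in
           [("north", 10), ("south", 7), ("east", 3), ("north", 5)]]
for rec in records:
    shards[shard_for(rec["region"], 4)].append(rec)
```

Because routing is deterministic, a query filtered on the shard key touches a single shard, which is the mechanism behind the query-time savings the abstract reports; indexes are then built per shard on the locally held documents.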

Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being

Procedia PDF Downloads 54
29132 Ergonomical Study of Hand-Arm Vibrational Exposure in a Gear Manufacturing Plant in India

Authors: Santosh Kumar, M. Muralidhar

Abstract:

The term ‘ergonomics’ is derived from two Greek words: ‘ergon’, meaning work, and ‘nomoi’, meaning natural laws. Ergonomics is the study of how working conditions, machines, and equipment can be arranged so that people can work with them more efficiently. In this research communication, an attempt has been made to study the effect of hand-arm vibration exposure on the workers of a gear manufacturing plant by comparing potential carpal tunnel syndrome (CTS) symptoms and examining the effect of different vibration exposure levels on the occurrence of CTS in an actual industrial environment. The chi-square test and correlation analysis were used for the statistical analysis. The chi-square test shows that the occurrence of potential CTS symptoms depends significantly on the level of vibration exposure. Data analysis indicates that 40.51% of workers with potential CTS symptoms are exposed to vibration. Correlation analysis reveals that potential CTS symptoms are significantly correlated with the level of vibration exposure from handheld tools and with repetitive wrist movements.
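The chi-square test of independence used above can be sketched for a 2x2 table of CTS symptoms versus vibration exposure. The counts below are illustrative, not the study's data:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. symptoms (yes/no) vs. exposure (yes/no).
    With 1 degree of freedom, values above 3.84 are significant at alpha=0.05."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        chi2 += (obs - expected) ** 2 / expected
    return chi2

# illustrative counts: exposed/unexposed workers with/without CTS symptoms
chi2 = chi_square_2x2([[10, 20], [20, 10]])
```

For the illustrative table, the statistic is 100/15 (about 6.67), which would exceed the 3.84 critical value and indicate dependence.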

Keywords: CTS symptoms, hand-arm vibration, ergonomics, physical tests

Procedia PDF Downloads 361
29131 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities

Authors: Anudeep Appe, Bhanu Poluparthi, Lakshmi Kasivajjula, Udai Mv, Sobha Bagadi, Punya Modi, Aditya Singh, Hemanth Gunupudi, Spenser Troiano, Jeff Paul, Justin Stovall, Justin Yamamoto

Abstract:

The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework that helps identify the factors impacting the market share of a healthcare provider facility or hospital (from here on termed a facility) is therefore of key importance. This pilot study aims to develop a data-driven machine learning regression framework that aids strategists in formulating key decisions to improve a facility's market share, which in turn helps improve the quality of healthcare services. The US (United States) healthcare business is chosen for the study, and the data covers 60 key facilities in Washington State over about three years of history. In the current analysis, market share is defined as the ratio of a facility's encounters to the total encounters of the group of potential competitor facilities. The study proposes a two-pronged approach: competitor identification, followed by regression to evaluate and predict market share. We leveraged SHAP, a model-agnostic technique, to quantify the relative importance of the features impacting market share. Typical techniques in the literature quantify the degree of competitiveness among facilities by empirically calculating a competitive factor that indicates the severity of competition. The proposed method instead identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust since it is data-driven, which minimizes the bias of empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder for various business rules (e.g., quantifying patient exchanges, provider referrals, and sister facilities). Multiple groups of competitors among facilities were identified.
Leveraging the identified competitors, we developed and fine-tuned a Random Forest regression model to predict market share. To identify the key drivers of market share at an overall level, permutation feature importance of the attributes was calculated. For relative quantification of features at the facility level, we incorporated SHAP (SHapley Additive exPlanations), a model-agnostic explainer, which helped identify and rank the attributes that impact market share at each facility. This approach combines two popular and efficient modeling practices, machine learning with graphs and tree-based regression, to reduce bias, and with these results we helped drive strategic business decisions.

Keywords: competition, DAGs, facility, healthcare, machine learning, market share, random forest, SHAP

Procedia PDF Downloads 78
29130 User Experience in Relation to Eye Tracking Behaviour in VR Gallery

Authors: Veslava Osinska, Adam Szalach, Dominik Piotrowski

Abstract:

Contemporary VR technologies allow users to explore virtual 3D spaces where they can work, socialize, learn, and play. Users' interaction with the GUI and the displayed pictures involves perceptual and cognitive processes that can be monitored with neuroadaptive technologies. These modalities provide valuable information about users' intentions, situational interpretations, and emotional states, allowing an application or interface to be adapted accordingly. Virtual galleries outfitted with specialized assets were designed using the Unity engine within the BITSCOPE project, in the frame of the CHIST-ERA IV program. Users' interaction with gallery objects raises questions about their visual interest in artworks and styles; moreover, attention, curiosity, and other emotional states can be monitored and analyzed. Natural gaze behavior data and eye positions were recorded by the built-in eye-tracking module of the HTC Vive VR headset. Eye gaze results are grouped according to users' behavior schemes, and the corresponding perceptual-cognitive styles are recognized. In parallel, usability tests and surveys were administered to identify the basic features of a user-centered interface for virtual environments across most of the project timeline. Sixty participants were selected from distinct university faculties and from secondary schools. Users' prior knowledge of art was evaluated in a pretest, which characterized their level of art sensitivity. Data were collected over two months, and each participant gave written informed consent before participation. For data analysis, nonlinear algorithms such as multidimensional scaling and the more recent t-distributed Stochastic Neighbor Embedding (t-SNE) were used to reduce the high-dimensional data to a relatively low-dimensional subspace. In this way, digital art objects can be classified by multimodal temporal characteristics of eye-tracking measures, revealing signatures that describe selected artworks.
The current research seeks the optimal point on the aesthetics-utility scale, since contemporary application interfaces must be designed to be both functional and aesthetic. The study also analyzes the visual experience of subsamples of visitors differentiated, e.g., by frequency of museum visits and cultural interests. Eye-tracking data may also show how to better place artefacts and paintings or increase their visibility where possible.

Keywords: eye tracking, VR, UX, visual art, virtual gallery, visual communication

Procedia PDF Downloads 23
29129 Designing and Formulating Action Plan for Development of Corporate Citizenship in Producing Units in Iran

Authors: Freyedon Ahmadi

Abstract:

Corporate citizenship is one of the most discussed topics in developed countries. Under this notion, a corporation is regarded much like an ordinary citizen, with civil rights respected for the corporation as for actual citizens; in return, citizens expect the corporation to show them reciprocal respect. The purpose of the current study is to assess the current state of corporate citizenship, and the factors affecting it, in industrial producing units, in order to formulate an action plan for corporate citizenship development. In this study, corporate citizenship is examined in four dimensions: legal, economic, ethical, and voluntary. Moreover, the impact of various factors on corporate citizenship is explored with a three-dimensional model of behavioral, structural, and contextual factors. Fifty companies from the food and petrochemical industries, together with 200 individuals selected from their boards of directors across Tehran province by stratified random sampling, constitute the sample. In terms of purpose and data collection method, the present study is descriptive-correlational; a questionnaire was used to collect the primary data. Instrument validity was established through expert opinion and structural equation modeling, and reliability was confirmed using Cronbach's alpha. The results indicate that close to 70 percent of the surveyed companies are not in a good condition with respect to corporate citizenship, and that structural, behavioral, and contextual factors all strongly influence the emergence of corporate citizenship behavior in the producing units.
Among the behavioral factors, social responsibility; among the structural factors, organic structure, human-centered orientation, medium size, and high organizational capacity; and among the contextual factors, clients' positive views of the company had the greatest influence on the surveyed producing units.

Keywords: corporate citizenship, structural factors, behavioral factors, contextual factors, producing units

Procedia PDF Downloads 222