Search results for: real time digital simulator
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22664

16034 A Comparative Study of the Impact of Membership in International Climate Change Treaties and the Environmental Kuznets Curve (EKC) in Line with Sustainable Development Theories

Authors: Mojtaba Taheri, Saied Reza Ameli

Abstract:

In this research, we calculate the effect of membership in international climate change treaties for 20 developed countries, selected on the basis of the human development index (HDI), and compare this effect with the pollutant-reduction process described by the Environmental Kuznets Curve (EKC) theory. For this purpose, real GDP per capita at constant 2010 prices is taken from the World Development Indicators (WDI) database. Ecological Footprint (ECOFP) is the amount of biologically productive land needed to meet human needs and absorb carbon dioxide emissions; it is measured in global hectares (gha), and the data are retrieved from the Global Ecological Footprint (2021) database. We proceed step by step, performing several series of targeted statistical regressions and examining the effects of different control variables. Energy Consumption Structure (ECS) is counted as the share of fossil fuel consumption in total energy consumption and is extracted from the United States Energy Information Administration (EIA) (2021) database. Energy Production (EP) refers to the total production of primary energy by all energy-producing enterprises in one country at a specific time; it is a comprehensive indicator of a country's energy production capacity, and its 2021 data, like the Energy Consumption Structure, are obtained from the EIA. Financial development (FND) is defined as the ratio of private credit to GDP, and to some extent based on stock market value, also as a ratio to GDP, and is taken from the WDI (2021 version). Trade Openness (TRD) is the sum of exports and imports of goods and services measured as a share of GDP, also taken from the WDI (2021 version). Urbanization (URB) is defined as the share of the urban population in the total population, likewise taken from the WDI (2021 version). The descriptive statistics of all the investigated variables are presented in the results section. Among the theories of sustainable development considered, the Environmental Kuznets Curve (EKC) is the more significant over the study period. In this research, we use more than fourteen targeted statistical regressions to isolate the net effects of each approach and examine the results.
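A minimal sketch of one such regression, assuming a standard EKC specification in which the ecological footprint is regressed on log GDP per capita, its square, a treaty-membership dummy and the control variables named above; the variable names and the synthetic data are illustrative, not the study's actual dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 20 * 30  # 20 countries x 30 years of synthetic observations

# Synthetic stand-ins for the variables described in the abstract
df = pd.DataFrame({
    "gdp":    rng.lognormal(10, 0.5, n),   # real GDP per capita (2010 USD)
    "ecs":    rng.uniform(0.3, 0.9, n),    # fossil-fuel share of energy use
    "ep":     rng.lognormal(3, 0.7, n),    # primary energy production
    "fnd":    rng.uniform(0.2, 1.5, n),    # private credit / GDP
    "trd":    rng.uniform(0.3, 2.0, n),    # (exports + imports) / GDP
    "urb":    rng.uniform(0.5, 0.95, n),   # urban population share
    "treaty": rng.integers(0, 2, n),       # treaty-membership dummy
})
# Ecological footprint generated with an inverted-U (EKC-like) shape in log GDP
lg = np.log(df["gdp"])
df["ecofp"] = 5 + 2.0 * lg - 0.12 * lg**2 - 0.3 * df["treaty"] + rng.normal(0, 0.3, n)

# EKC regression: ECOFP on log GDP, its square, the treaty dummy and controls
X = sm.add_constant(pd.DataFrame({
    "ln_gdp": lg, "ln_gdp_sq": lg**2,
    "treaty": df["treaty"], "ecs": df["ecs"], "ep": np.log(df["ep"]),
    "fnd": df["fnd"], "trd": df["trd"], "urb": df["urb"],
}))
model = sm.OLS(df["ecofp"], X).fit()
print(model.summary())
# An EKC pattern shows up as a positive ln_gdp and negative ln_gdp_sq coefficient.
```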

Keywords: climate change, globalization, environmental economics, sustainable development, international climate treaty

Procedia PDF Downloads 66
16033 An Assessment of Factors Affecting the Cost and Time Performance of Subcontractors

Authors: Adedayo Jeremiah Adeyekun, Samuel Oluwagbemiga Ishola

Abstract:

This paper assesses the factors influencing the cost and time performance of subcontractors and the need for effective performance of subcontractors at project sites. The factors influencing the performance of subcontractors can be grouped into those associated with the project or the organization and, on the other hand, those significant to the subcontractors themselves. These factors include management-level leadership, time required to complete the project, profit, staff capability/expertise, reputation, payment method, organization history, project procurement strategy, security, bidding technique, insurance, bond, and relationship with the main contractors. The factors influencing the management of subcontractors in building development projects include performance on significant past projects, standard of workmanship, compliance with guidelines, regular payment to labourers, adherence to programme, regularity and effectiveness of communication with the main contractor, and adherence to subcontract requirements. Other factors comprise adherence to statutory environmental regulations, number of experienced site administrative staff, inspection and maintenance of a good workplace, number of artisans and workers, quality of as-built and shop drawings, and ability to carry out the required quantity of work. This study also aims to suggest a way forward to improve the performance of subcontractors, whose underperformance is a common reason for budget overruns at project sites. To carry out this study, a questionnaire was drafted to derive information on the causes of low subcontractor performance and the implications for cost.

Keywords: performance, contractor, subcontractors, construction

Procedia PDF Downloads 71
16032 Sync Consensus Algorithm: Trying to Reach an Agreement at Full Speed

Authors: Yuri Zinchenko

Abstract:

Recently, distributed storage systems have been used more and more in various aspects of everyday life. They provide such necessary properties as scalability, fault tolerance, durability, and others. At the same time, storage that is not only reliable but also fast remains one of the most pressing issues in this area. That brings us to the consensus algorithm as one of the most important components, one that has a great impact on the functionality of a distributed system. This paper is the result of an analysis of several well-known consensus algorithms, such as Paxos and Raft. The algorithm it offers, called Sync, encourages, but does not insist on, simultaneous writing to the nodes (which positively affects the overall writing speed) and tries to minimize the system's idle time. This allows nodes to reach agreement on the system state in a shorter period, which is a critical factor for distributed systems. When developing Sync, a lot of attention was also paid to criteria such as simplicity and intuitiveness, the importance of which is difficult to overestimate.

Keywords: sync, consensus algorithm, distributed system, leader-based, synchronization

Procedia PDF Downloads 56
16031 Ultra-Wideband Antennas for Ultra-Wideband Communication and Sensing Systems

Authors: Meng Miao, Jeongwoo Han, Cam Nguyen

Abstract:

Ultra-wideband (UWB) time-domain impulse communication and radar systems use ultra-short duration pulses in the sub-nanosecond regime, instead of continuous sinusoidal waves, to transmit information. The pulse directly generates a very wide-band instantaneous signal with various duty cycles depending on the specific usage. In UWB systems, the total transmitted power is spread over an extremely wide range of frequencies, so the power spectral density is extremely low. This effectively results in extremely small interference to other radio signals while maintaining excellent immunity to interference from those signals. UWB devices can therefore work within frequencies already allocated for other radio services, thus helping to maximize this dwindling resource. The impulse UWB technique is therefore attractive for realizing high-data-rate, short-range communications, ground penetrating radar (GPR), and military radar with relatively low emission power levels. UWB antennas are the key element dictating the transmitted and received pulse shape and amplitude in both the time and frequency domains. They should have a good impulse response with minimal distortion. To facilitate integration with transmitters and receivers employing microwave integrated circuits, UWB antennas enabling direct integration are preferred. We present the development of two UWB antennas, operating over 3.1-10.6 GHz and 0.3-6 GHz, for UWB systems that provide direct integration with microwave integrated circuits. The operation of these antennas is based on the principle of wave propagation on a non-uniform transmission line. Time-domain EM simulation is conducted to optimize the antenna structures to minimize reflections occurring at the open-end transition. Calculated and measured results of these UWB antennas are presented in both the frequency and time domains. The antennas have good time-domain responses. They can transmit and receive pulses effectively with minimum distortion, little ringing, and small reflection, clearly demonstrating the signal fidelity of the antennas in reproducing the waveform of UWB signals, which is critical for UWB sensors and communication systems. Good performance together with seamless microwave integrated-circuit integration makes these antennas good candidates not only for UWB applications but also for integration with printed-circuit UWB transmitters and receivers.

Keywords: antennas, ultra-wideband, UWB, UWB communication systems, UWB radar systems

Procedia PDF Downloads 233
16030 Tracing the Concept of Equivalence in Translation Theories from the Linguistics Oriented Era to Present

Authors: Fatma Ülkü Kavruk

Abstract:

The comparison of old and new approaches reveals that the concept of equivalence has been interpreted and categorized in different ways by different scholars throughout history. The aim of this study is to trace the concept of equivalence in translation theories from the linguistics-oriented era to the present, referring to various translation scholars, and to provide a critical evaluation of the nature and applicability of the concept of equivalence in today's world of translation studies. Within the study, various interpretations of equivalence proposed by international scholars in translation studies are presented. In order to find out how these scholars' approaches are reflected in Turkish scholars' research, the interpretations of equivalence by various Turkish scholars are also examined. At the end of the paper, the applicability of the concept of equivalence in real life is discussed in light of these approaches.

Keywords: translation studies, equivalence, translation theories, evaluation

Procedia PDF Downloads 486
16029 Survey of Prevalence of Noise Induced Hearing Loss in Hawkers and Shopkeepers in Noisy Areas of Mumbai City

Authors: Hitesh Kshayap, Shantanu Arya, Ajay Basod, Sachin Sakhuja

Abstract:

This study was undertaken to measure the overall noise levels in different locations/zones and to estimate the prevalence of noise-induced hearing loss in hawkers and shopkeepers in Mumbai, India. The Hearing Test developed by the American Academy of Otolaryngology, translated from English to Hindi and validated, was used as a screening tool for hearing sensitivity. The tool has 14 items, each scored on a scale of 0, 1, 2 and 3. A score of 6 or above indicated some or definite difficulty in hearing in daily activities, and a lower score indicated lesser difficulty or normal hearing. Subjects who scored 6 or above or who had tinnitus underwent hearing evaluation with a pure tone audiometer. Further, environmental noise levels were measured from morning to evening at the roadside at different locations/hawking zones in Mumbai city using a digital sound level meter (SLM; Agronic 8928 B & K type) in dB(A). The maximum noise level of 100.0 dB(A) was recorded during evening hours from Chattrapati Shivaji Terminal to Colaba, with an overall noise level of 79.0 dB(A); the minimum noise level in this area was 72.6 dB(A) at any given point of time. Further, 54.6 dB(A) was recorded as the minimum noise level during 8-9 am at Sion Circle. The commencement of flyovers with two-tier traffic, skywalks, increasing vehicular traffic on the roads, high-rise buildings and other commercial and urbanization activities in Mumbai city have most probably increased the overall environmental noise levels, while trees which acted as noise absorbers have been cut owing to rapid construction. The study involved 100 participants in the age range of 18 to 40 years, with a mean age of 29 years (S.D. = 6.49). The 46 participants who had tinnitus or obtained a score of 6 underwent pure tone audiometry, and it was found that the prevalence rate of hearing loss in hawkers and shopkeepers is 19% (10% hawkers and 9% shopkeepers). The results indicate that the 29 (42.6%) of 64 hawkers and 17 (47.2%) of 36 shopkeepers who underwent PTA showed no significant difference in the percentage of noise-induced hearing loss. The study results also reveal that 19 (41.30%) of the 46 participants who exhibited tinnitus had mild to moderate sensorineural hearing loss between 3000 Hz and 6000 Hz. The pure tone audiogram pattern revealed hearing loss at 4000 Hz and 6000 Hz, while hearing at adjacent frequencies was nearly normal. Seven hawkers and eight shopkeepers had a mild notch, while three hawkers and one shopkeeper had a moderate notch. It is thus inferred that tinnitus is a strong indicator of the presence of hearing loss and that a 4/6 kHz notch is a strong marker for road/traffic/environmental noise as an occupational hazard for hawkers and shopkeepers. Mass awareness about these occupational hazards, regular hearing check-ups and early intervention, along with sustainable development juxtaposed with social and urban forestry, can help in this regard.

Keywords: NIHL, noise, sound level meter, tinnitus

Procedia PDF Downloads 192
16028 Automatic Seizure Detection Using Weighted Permutation Entropy and Support Vector Machine

Authors: Noha Seddik, Sherine Youssef, Mohamed Kholeif

Abstract:

The automated epileptic seizure detection research field has emerged in recent years; it involves analyzing electroencephalogram (EEG) signals instead of the traditional visual inspection performed by expert neurologists. In this study, a Support Vector Machine (SVM) that uses Weighted Permutation Entropy (WPE) as the input feature is proposed for classifying normal and seizure EEG records. WPE is a modified statistical parameter of the permutation entropy (PE) that measures the complexity and irregularity of a time series. It incorporates both the mapped ordinal pattern of the time series and the information contained in the amplitude of its sample points. The proposed system exploits the fact that entropy-based measures of EEG segments during epileptic seizures are lower than those of normal EEG.
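A minimal sketch of the feature-plus-classifier idea described above, with a straightforward implementation of weighted permutation entropy; the embedding dimension, delay and synthetic signals are illustrative assumptions, not the study's settings:

```python
import numpy as np
from itertools import permutations
from sklearn.svm import SVC

def weighted_permutation_entropy(x, m=3, tau=1):
    """WPE: ordinal-pattern entropy with each pattern weighted by the
    variance of its embedding vector, normalized to [0, 1]."""
    x = np.asarray(x, dtype=float)
    n_vec = len(x) - (m - 1) * tau
    patterns = {p: i for i, p in enumerate(permutations(range(m)))}
    weights = np.zeros(len(patterns))
    for j in range(n_vec):
        v = x[j:j + (m - 1) * tau + 1:tau]
        idx = patterns[tuple(np.argsort(v))]   # ordinal pattern of the vector
        weights[idx] += np.var(v)              # weight = amplitude information
    p = weights[weights > 0] / weights.sum()
    return -np.sum(p * np.log(p)) / np.log(len(patterns))

# Synthetic stand-ins: "normal" EEG as broadband noise, "seizure" as a more
# regular, lower-complexity oscillation with a small noise component.
rng = np.random.default_rng(1)
t = np.arange(512) / 256.0
normal  = [rng.normal(size=512) for _ in range(50)]
seizure = [np.sin(2 * np.pi * 5 * t) + 0.2 * rng.normal(size=512) for _ in range(50)]

X = np.array([[weighted_permutation_entropy(s)] for s in normal + seizure])
y = np.array([0] * 50 + [1] * 50)
clf = SVC(kernel="rbf").fit(X[::2], y[::2])       # train on half the segments
print("accuracy:", clf.score(X[1::2], y[1::2]))   # test on the other half
```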

Keywords: electroencephalogram (EEG), epileptic seizure detection, weighted permutation entropy (WPE), support vector machine (SVM)

Procedia PDF Downloads 363
16027 Time Efficient Color Coding for Structured-Light 3D Scanner

Authors: Po-Hao Huang, Pei-Ju Chiang

Abstract:

The structured-light 3D scanner is commonly used for measuring the 3D shape of an object. By projecting designed light patterns onto the object, deformed patterns can be obtained and used for geometric shape reconstruction. At present, Gray code is the most reliable and commonly used light pattern in structured-light 3D scanners. However, the trade-off between scanning efficiency and accuracy is a long-standing and challenging problem, and the design of the light patterns plays a significant role in both. We therefore propose a novel encoding method integrating color information with Gray code to improve scanning efficiency. We demonstrate that with the proposed method, the scanning time can be reduced to approximately half of that needed by Gray code alone, without loss of precision.
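A minimal sketch of the general idea of packing Gray-code bit-planes into color channels, which is one plausible way to halve the number of projected patterns; the packing scheme shown is an illustrative assumption, not necessarily the authors' exact encoding:

```python
import numpy as np

def gray_code_bitplanes(width, n_bits):
    """Binary Gray-code stripe codes: one bit-plane per projected pattern,
    given here as one code value per projector column."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                     # binary-reflected Gray code
    return [((gray >> b) & 1).astype(np.uint8)    # MSB first
            for b in reversed(range(n_bits))]

def pack_into_color(planes):
    """Pack pairs of bit-planes into the R and G channels of one pattern,
    so two conventional Gray-code frames become a single color frame."""
    packed = []
    for i in range(0, len(planes), 2):
        r = planes[i]
        g = planes[i + 1] if i + 1 < len(planes) else np.zeros_like(planes[i])
        b = np.zeros_like(r)
        packed.append(np.stack([r, g, b], axis=-1) * 255)
    return packed

planes = gray_code_bitplanes(width=1024, n_bits=10)   # 10 monochrome patterns
frames = pack_into_color(planes)                      # 5 color patterns
print(len(planes), "monochrome patterns ->", len(frames), "color patterns")
```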

Keywords: gray-code, structured light scanner, 3D shape acquisition, 3D reconstruction

Procedia PDF Downloads 454
16026 Ammonia Bunkering Spill Scenarios: Modelling Plume’s Behaviour and Potential to Trigger Harmful Algal Blooms in the Singapore Straits

Authors: Bryan Low

Abstract:

In the coming decades, the global maritime industry will face a most formidable environmental challenge: achieving net zero carbon emissions by 2050. To meet this target, the Maritime Port Authority of Singapore (MPA) has worked to establish green shipping and digital corridors with ports of several other countries around the world where ships will use low-carbon alternative fuels such as ammonia for power generation. While this paradigm shift to the bunkering of greener fuels is encouraging, fuels like ammonia will also introduce a new and unique type of environmental risk in the unlikely scenario of a spill. While numerous modelling studies have been conducted for oil spills and their associated environmental impact on coastal and marine ecosystems, ammonia spills are comparatively less well understood. For example, there is a knowledge gap regarding how the complex hydrodynamic conditions of the Singapore Straits may influence the dispersion of a hypothetical ammonia plume, which has different physical and chemical properties compared to an oil slick. Chemically, ammonia can be absorbed by phytoplankton, thus altering the balance of the marine nitrogen cycle. Biologically, ammonia generally serves the role of a nutrient in coastal ecosystems at lower concentrations. However, at higher concentrations, it has been found to be toxic to many local species. It may also have the potential to trigger eutrophication and harmful algal blooms (HABs) in coastal waters, depending on local hydrodynamic conditions. Thus, the key objective of this research paper is to support the development of a model-based forecasting system that can predict ammonia plume behaviour in coastal waters, given prevailing hydrodynamic conditions and their environmental impact. This will be essential as ammonia bunkering becomes more commonplace in Singapore’s ports and around the world. Specifically, this system must be able to assess the HAB-triggering potential of an ammonia plume, as well as its lethal and sub-lethal toxic effects on local species. This will allow the relevant authorities to better plan risk mitigation measures or choose a time window with the ideal hydrodynamic conditions to conduct ammonia bunkering operations with minimal risk. In this paper, we present the first part of such a forecasting system: a jointly coupled hydrodynamic-water quality model that can capture how advection-diffusion processes driven by ocean currents influence plume behaviour and how the plume interacts with the marine nitrogen cycle. The model is then applied to various ammonia spill scenarios where the results are discussed in the context of current ammonia toxicity guidelines, impact on local ecosystems, and mitigation measures for future bunkering operations conducted in the Singapore Straits.
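The transport part of such a coupled model rests on the advection-diffusion equation; below is a minimal one-dimensional finite-difference sketch of how a concentration pulse would be advected by a current and spread by diffusion. The grid, current speed and diffusivity are illustrative values, not calibrated to the Singapore Straits, and the full model would of course be multi-dimensional and coupled to the nitrogen cycle:

```python
import numpy as np

# Illustrative 1-D advection-diffusion of an ammonia concentration pulse:
# dC/dt + u dC/dx = D d2C/dx2  (upwind advection, explicit diffusion,
# periodic boundaries for simplicity)
L, nx = 10_000.0, 500            # 10 km domain, 500 cells
dx = L / nx
u, D = 0.5, 5.0                  # current speed [m/s], diffusivity [m^2/s]
dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # stable explicit time step

x = np.linspace(0, L, nx)
C = np.exp(-((x - 2000) / 200.0) ** 2)    # initial spill plume (arbitrary units)

for _ in range(500):
    adv = -u * (C - np.roll(C, 1)) / dx                          # upwind (u > 0)
    dif = D * (np.roll(C, -1) - 2 * C + np.roll(C, 1)) / dx**2   # diffusion
    C = C + dt * (adv + dif)

print("peak concentration after transport:", round(C.max(), 3))
print("plume centre has moved to x =", round(x[np.argmax(C)]), "m")
```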

Keywords: ammonia bunkering, forecasting, harmful algal blooms, hydrodynamics, marine nitrogen cycle, oceanography, water quality modeling

Procedia PDF Downloads 75
16025 Risks beyond Cyber in IoT Infrastructure and Services

Authors: Mattias Bergstrom

Abstract:

Significance of the Study: This research provides new insights into the risks associated with digital embedded infrastructure. Through this research, we analyze each risk and its potential mitigation strategies, especially for AI and autonomous automation. Moreover, the analysis presented in this paper conveys valuable information for future research aimed at creating more stable, secure, and efficient autonomous systems. To learn and understand the risks, a large IoT system was envisioned, and risks related to hardware, tampering, and cyberattacks were collected, researched, and evaluated to create a comprehensive understanding of the potential risks. Potential solutions were then evaluated on an open-source IoT hardware setup. The following list shows the identified passive and active risks evaluated in the research. Passive risks: (1) Hardware failures: critical systems relying on high-rate data and data quality are growing; SCADA systems for infrastructure are good examples of such systems. (2) Hardware delivering erroneous data: sensors break, and when they do, they don't always go silent; they can keep going, except that the data they deliver is garbage, and if that data is not filtered out, it becomes disruptive noise in the system. (3) Bad hardware injection: erroneously generated sensor data can be pumped into a system by malicious actors with the intent to create disruptive noise in critical systems. (4) Data gravity: the weight of the data collected will affect data mobility. (5) Cost inhibitors: running services that need huge centralized computing is cost-inhibiting, and large complex AI can be extremely expensive to run. Active risks: Denial of service: one of the simplest attacks, where an attacker just overloads the system with bogus requests so that valid requests disappear in the noise. Malware: malware can be anything from simple viruses to complex botnets created with specific goals, where the creator steals computing power and bandwidth from you to attack someone else. Ransomware: a kind of malware, but so different in its implementation that it is worth its own mention; the goal of these pieces of software is to encrypt your system so that it can only be unlocked with a key that is held for ransom. DNS spoofing: by spoofing DNS calls, valid requests and data dumps can be sent to bad destinations, where the data can be extracted for extortion or corrupted and re-injected into a running system, creating a data echo noise loop. After testing multiple potential solutions, we found that the most prominent solution to these risks was to use a peer-to-peer consensus algorithm over a blockchain to validate the data and behavior of the devices (sensors, storage, and computing) in the system. With the devices autonomously policing themselves for deviant behavior, all risks listed above can be mitigated. In conclusion, an Internet middleware that provides these features would be an easy and secure solution for any future autonomous IoT deployment, as it provides separation from the open Internet while remaining accessible through the blockchain keys.

Keywords: IoT, security, infrastructure, SCADA, blockchain, AI

Procedia PDF Downloads 103
16024 A Human Factors Approach to Workload Optimization for On-Screen Review Tasks

Authors: Christina Kirsch, Adam Hatzigiannis

Abstract:

Rail operators and maintainers worldwide are increasingly replacing walking patrols in the rail corridor with mechanized track patrols (essentially data capture on trains) and on-screen reviews of track infrastructure in centralized review facilities. The benefit is that infrastructure workers are less exposed to the dangers of the rail corridor. The impact is a significant change in work design, from walking track sections and direct observation in the real world to sedentary jobs in the review facility reviewing captured data on screens. Defects in rail infrastructure can have catastrophic consequences. Reviewer performance regarding accuracy and efficiency of reviews within the available time frame is essential to ensure safety and operational performance. Rail operators must optimize workload and resource loading to transition to on-screen reviews successfully. Therefore, they need to know what workload assessment methodologies will provide reliable and valid data to optimize resourcing for on-screen reviews. This paper compares objective workload measures, including track difficulty ratings and review distance covered per hour, with subjective workload assessments (NASA TLX) and analyses the link between workload and reviewer performance, including sensitivity, precision, and overall accuracy. An experimental study was completed with eight on-screen reviewers, including infrastructure workers and engineers, reviewing track sections with different levels of track difficulty over nine days. Each day the reviewers completed four 90-minute sessions of on-screen inspection of the track infrastructure. Data regarding the speed of review (km/hour), detected defects, false negatives, and false positives were collected. Additionally, all reviewers completed a subjective workload assessment (NASA TLX) after each 90-minute session and a short employee engagement survey at the end of the study period that captured impacts on job satisfaction and motivation. The results showed that objective measures of track difficulty align with subjective mental demand, temporal demand, effort, and frustration in the NASA TLX. Interestingly, review speed correlated with subjective assessments of physical and temporal demand, but not with mental demand. Subjective performance ratings correlated with all accuracy measures and with review speed. The results showed that subjective NASA TLX workload assessments accurately reflect objective workload. The analysis of the impact of workload on performance showed that subjective mental demand correlated with high precision (accurately detected defects rather than false positives). Conversely, high temporal demand was negatively correlated with sensitivity and the percentage of detected existing defects. Review speed was significantly correlated with false negatives: as review speed increased, accuracy declined. On the other hand, review speed correlated with subjective performance assessments; reviewers thought their performance was higher when they reviewed the track sections faster, despite the decline in accuracy. The study results were used to optimize resourcing and ensure that reviewers had enough time to review the allocated track sections to improve defect detection rates in accordance with the efficiency-thoroughness trade-off.
Overall, the study showed the importance of a multi-method approach to workload assessment and optimization, combining subjective workload assessments with objective workload and performance measures to ensure that recommendations for work system optimization are evidence-based and reliable.

Keywords: automation, efficiency-thoroughness trade-off, human factors, job design, NASA TLX, performance optimization, subjective workload assessment, workload analysis

Procedia PDF Downloads 116
16023 Addiction Counseling Resources: A Qualitative Study

Authors: Cailyn Green

Abstract:

Substance use counselors have a variety of fast-paced tasks and responsibilities. Professional resources are designed to make professionals' job duties easier and less stressful. The purpose of this research was to identify what types of resources would support addiction counselors in performing their job duties. Counselors often must step in and facilitate a group counseling session with little to no time for preparation, which causes stress and creates pressure to come up with a clinical group activity on short notice. The researcher utilized qualitative interviews focused on identifying what types of resources would help addiction counselors do their jobs more easily and effectively. The researcher visited 23 different addiction counseling facilities seeking participants for the interviews. Altogether, 15 interviews were collected across six different substance-use counseling facilities. The interviews guided the researcher toward creating an open educational resource (OER) of group activities for addiction counselors to use.

Keywords: addiction, counseling, resources, OER, treatment

Procedia PDF Downloads 71
16022 Managing High-Performance Virtual Teams

Authors: Mehdi Rezai, Asghar Zamani

Abstract:

Virtual teams are a reality in today's fast-paced world. With the widespread use of shared resources, an increase in inter-organizational projects, cooperation and outsourcing, and the growing number of people who work remotely or on flexitime, an extensive and active presence of high-performance teams is a must. Virtual teams are a challenge in themselves: their members cross the barriers of culture, time zone and organization, and they often communicate through electronic devices over considerable distances. Firstly, we examine the management of virtual teams by considering different issues such as cultural and personal diversity, communication and coordination issues. Then we examine the individuals, processes and existing tools in a team. The main challenge is managing high-performance virtual teams. First of all, we must examine the concept of performance; then we must focus on teams and the best methods of managing them. Constant improvement of performance, together with precise regulation of every individual's way of working, increases performance levels over time. High-performance teams exploit every issue as an opportunity for achieving high performance, and delivering projects with high performance is among every organization's or team's objectives. Performance can be measured using many criteria, among which completing projects on time, the satisfaction of stakeholders, and not exceeding budgets can be named. Elements such as clear objectives, clearly defined roles and responsibilities, effective communication, and commitment to collaboration are essential to a team's effectiveness. Finally, we examine roles, systems and processes, and carry out a cause-and-effect analysis of different criteria for improving a team's performance.

Keywords: virtual teams, performance, management, process, improvement, effectiveness

Procedia PDF Downloads 146
16021 Non-Invasive Imaging of Tissue Using Near Infrared Radiations

Authors: Ashwani Kumar Aggarwal

Abstract:

NIR light is non-ionizing and can pass easily through living tissues such as the breast without any harmful effects. Therefore, using NIR light to image biological tissue and quantify its optical properties is a good alternative to invasive methods. Optical tomography involves two steps. One is the forward problem and the other is the reconstruction problem. The forward problem consists of finding the measurements of light transmitted through the tissue from source to detector, given the spatial distribution of absorption and scattering properties. The second step is the reconstruction problem. In X-ray tomography, there are standard reconstruction methods, such as filtered back projection and the algebraic reconstruction techniques, but these cannot be applied as such in optical tomography due to the highly scattering nature of biological tissue. A hybrid reconstruction algorithm has been implemented in this work which takes into account the highly scattered paths taken by photons while back-projecting the forward data obtained during Monte Carlo simulation. The reconstructed image suffers from blurring due to the point spread function. This blurred reconstructed image has been enhanced using a digital filter which is optimal in the mean square sense.
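A minimal sketch of the final enhancement step, assuming the filter "optimal in the mean square sense" is a Wiener-type deconvolution applied to the blurred reconstruction; the Gaussian point spread function and noise level here are illustrative assumptions, not the paper's actual system response:

```python
import numpy as np

def wiener_deblur(blurred, psf, noise_to_signal=0.01):
    """Frequency-domain Wiener deconvolution: H* / (|H|^2 + K)."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifft2(W * G))

# Illustrative blurred "reconstruction": a bright inclusion blurred by a PSF
rng = np.random.default_rng(0)
img = np.zeros((64, 64)); img[28:36, 28:36] = 1.0

xx, yy = np.meshgrid(np.arange(64), np.arange(64))
psf = np.exp(-(((xx - 32) ** 2 + (yy - 32) ** 2) / (2 * 2.0 ** 2)))
psf /= psf.sum()
psf = np.fft.ifftshift(psf)                  # centre the PSF at the origin

blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))
blurred += 0.01 * rng.normal(size=blurred.shape)

restored = wiener_deblur(blurred, psf)
print("MSE before deblurring:", np.mean((blurred - img) ** 2))
print("MSE after Wiener filter:", np.mean((restored - img) ** 2))
```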

Keywords: least-squares optimization, filtering, tomography, laser interaction, light scattering

Procedia PDF Downloads 311
16020 Topology-Based Character Recognition Method for Coin Date Detection

Authors: Xingyu Pan, Laure Tougne

Abstract:

For recognizing coins, the engraved release date is important information for precisely identifying their monetary type. However, reading characters on coins meets far more obstacles than traditional character recognition tasks in other fields, such as reading scanned documents or license plates. To address this challenging issue in a numismatic context, we propose a training-free approach dedicated to the detection and recognition of the release date of a coin. In the first step, the date zone is detected by comparing histogram features; in the second step, a topology-based algorithm is introduced to recognize coin digits of various font types represented by a binary gradient map. Our method obtained a recognition rate of 92% on synthetic data and of 44% on real noised data.
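A minimal sketch of the first step, assuming the histogram comparison is an intersection score between a candidate window's gradient-orientation histogram and a reference date-zone histogram; the descriptor, window size and scoring are illustrative choices, not the authors' exact features:

```python
import numpy as np

def gradient_histogram(patch, bins=16):
    """Gradient-orientation histogram of a patch, weighted by gradient magnitude."""
    gy, gx = np.gradient(patch.astype(float))
    ang = np.arctan2(gy, gx)
    mag = np.hypot(gx, gy)
    hist, _ = np.histogram(ang, bins=bins, range=(-np.pi, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)

def histogram_intersection(h1, h2):
    return np.minimum(h1, h2).sum()

# Illustrative use: slide candidate windows over a coin image and keep the
# window whose histogram best matches a reference date-zone histogram.
rng = np.random.default_rng(0)
coin = rng.random((128, 128))
reference = gradient_histogram(coin[90:110, 40:90])   # stand-in date zone

best_score, best_xy = -1.0, None
for y in range(0, 128 - 20, 8):
    for x in range(0, 128 - 50, 8):
        score = histogram_intersection(gradient_histogram(coin[y:y+20, x:x+50]),
                                       reference)
        if score > best_score:
            best_score, best_xy = score, (y, x)
print("best matching window at", best_xy, "score", round(best_score, 3))
```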

Keywords: coin, detection, character recognition, topology

Procedia PDF Downloads 249
16019 Pentax Airway Scope Video Laryngoscope for Orotracheal Intubation in Children: A Randomized Controlled Trial

Authors: In Kyong Yi, Yun Jeong Chae, Jihoon Hwang, Sook-Young Lee, Jong-Yeop Kim

Abstract:

Background: The Pentax airway scope (AWS) is a recently developed video laryngoscope for use in both normal and difficult airways, providing a good laryngeal view. The purpose of this randomized noninferiority study was to evaluate the efficacy of the Pentax-AWS regarding intubation time, laryngeal view and ease of intubation in pediatric patients with a normal airway, compared with the Macintosh laryngoscope. Method: A total of 136 pediatric patients aged 1 to 10 years, with American Society of Anesthesiologists physical status I or II, undergoing general anesthesia and requiring orotracheal intubation were randomly allocated into two groups: Macintosh laryngoscope (n=68) and Pentax AWS (n=68). Anesthesia was induced with propofol, rocuronium, and sevoflurane. The primary outcome was intubation time. Cormack-Lehane laryngeal view grade, application of optimal external laryngeal manipulation (OELM), intubation difficulty scale (IDS), intubation failure rate and adverse events were also measured. Result: No significant difference was observed between the two groups regarding intubation time (Macintosh: 23 [22-26] s vs. Pentax: 23.5 [22-27.75] s, p=0.713). As for the laryngeal view grade, the Pentax group showed fewer cases of grade 2a or higher compared with the Macintosh group (grade 1/2a/2b/3: 52.9%/41.2%/4.4%/1.5% vs. 98.5%/1.5%/0%/0%, p=0.000). No optimal external laryngeal manipulation was required in the Pentax group (38.2% vs. 0%, p=0.000). The intubation difficulty scale yielded lower values in the Pentax group (0 [0-2] vs. 0 [0-0.55], p=0.001). The failure rate was not different between the two groups (1.5% vs. 4.4%, p=0.619). Regarding adverse events, a slightly higher incidence of bleeding (1.5% vs. 5.9%, p=0.172) and of teeth injury (0% vs. 5.9%, p=0.042) occurred in the Pentax group. Conclusion: Pentax-AWS provided a better laryngeal view, similar intubation time and a similar success rate compared with the Macintosh laryngoscope in children with a normal airway. However, the risk of teeth injury might increase and warrants special attention.

Keywords: Pentax-AWS, pediatric, video laryngoscope, intubation

Procedia PDF Downloads 197
16018 DeClEx-Processing Pipeline for Tumor Classification

Authors: Gaurav Shinde, Sai Charan Gongiguntla, Prajwal Shirur, Ahmed Hambaba

Abstract:

Health issues are increasing significantly, putting a substantial strain on healthcare services and accelerating the integration of machine learning in healthcare, particularly following the COVID-19 pandemic. We introduce DeClEx, a pipeline that ensures that data mirrors real-world settings by incorporating Gaussian noise and blur, and that employs autoencoders to learn intermediate feature representations. Subsequently, our convolutional neural network, paired with spatial attention, provides accuracy comparable to state-of-the-art pre-trained models while achieving a threefold improvement in training speed. Furthermore, we provide interpretable results using explainable AI techniques. We integrate denoising and deblurring, classification, and explainability in a single pipeline called DeClEx.
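A minimal sketch of the data-degradation stage described above, assuming "Gaussian noise and blur" means a Gaussian blur followed by additive Gaussian noise; the noise and blur parameters and the synthetic scan are illustrative, and the degraded/clean pairs would then feed the autoencoder stage of the pipeline:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade(image, noise_sigma=0.05, blur_sigma=1.5, rng=None):
    """Simulate real-world acquisition: Gaussian blur, then additive Gaussian noise."""
    rng = rng or np.random.default_rng()
    blurred = gaussian_filter(image.astype(float), sigma=blur_sigma)
    noisy = blurred + rng.normal(0.0, noise_sigma, size=image.shape)
    return np.clip(noisy, 0.0, 1.0)

# Illustrative use on a synthetic "scan" with a single bright region
rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[20:44, 20:44] = 1.0
noisy = degrade(clean, rng=rng)
print("pixel-wise MSE introduced by degradation:", np.mean((noisy - clean) ** 2))
```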

Keywords: machine learning, healthcare, classification, explainability

Procedia PDF Downloads 49
16017 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: Gaelle Candel, David Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely used. It helps with two main tasks: displaying results by coloring items according to item class or feature value, and, for forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are the structure preservation property and the answer to the crowding problem, where all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, where the cluster area is proportional to its size in number, and relationships between clusters are materialized by closeness on the embedding. This algorithm is non-parametric: the transformation from a high- to a low-dimensional space is described but not learned, and two initializations of the algorithm would lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of data. While this approach is highly scalable, points could be mapped at exactly the same position, making them indistinguishable, and this type of model would be unable to adapt to new outliers or concept drift. This paper presents a methodology to reuse an embedding to create a new one, where cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once, with the newly obtained embedding. The successive embeddings can be used to study the impact of one variable over the dataset distribution or to monitor changes over time. This method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing to observe the birth, evolution, and death of clusters. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of high-dimensional datasets' dynamics.

Keywords: concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning

Procedia PDF Downloads 140
16016 Analyzing the Perception of Students and Faculty Members on Social Media Use in Academic Activities: A Case Study of Beijing Normal University

Authors: Mcjerry A. Bekoe, Emile Uwamahoro

Abstract:

Social media has become the order of the day, particularly among the youth. It is widely used both formally and informally in university communities, with varied definitions both in academic circles and in the public domain. In simple terms, it is a medium through which social interactions are carried out. In this work, social media denotes the mobile phone and web-based applications used by students and institutions to construct, partake in, and distribute both existing and new information in a digital setting through internet communication. The basic aim of this study was to analyze the perception of students and faculty members of Beijing Normal University of social media use in the academic setting and to contribute to the understanding of how university students use social media, as well as the advantages and disadvantages of social media in education. The study was qualitative and employed open-ended interview questions developed to seek students' perceptions of the effects of social media, administered on the basis of purposive sampling. Document analysis was also carried out for triangulation, to ensure validity and reliability. The results show there are both positive and negative impacts of social media use, depending on how one uses it. Social media has the capability to become a priceless asset for aiding educational communication.

Keywords: academics, high education, interactions, social media

Procedia PDF Downloads 333
16015 Designing a Motivated Tangible Multimedia System for Preschoolers

Authors: Kien Tsong Chau, Zarina Samsudin, Wan Ahmad Jaafar Wan Yahaya

Abstract:

The paper examined the capability of a prototype tangible multimedia system, augmented with tangible objects, to motivate young preschoolers in learning. Preschoolers' learning behaviour is highly captivated and motivated by external physical stimuli. Hence, conventional multimedia, which depends solely on digital visual and auditory formats for knowledge delivery, could potentially place them in circumstances that are frustrating or boring or, worse, impede overall learning motivation. This paper begins with a discussion of the objectives of the research, followed by the research questions, hypotheses, the ARCS model of motivation adopted in the macro-design process, and the research instrument, the Persuasive Multimedia Motivational Scale, deployed for measuring subjects' level of motivation towards the experimental tangible multimedia. At the close, a succinct description of the findings of the research is provided. In the research, a total of 248 preschoolers recruited from seven Malaysian kindergartens were examined. Analyses revealed that the tangible multimedia system improved preschoolers' learning motivation significantly more than conventional multimedia did. Overall, the findings led to the conclusion that the tangible multimedia system is a motivation-conducive multimedia for preschoolers.

Keywords: tangible multimedia, preschoolers, multimedia, tangible objects

Procedia PDF Downloads 604
16014 Banking Crisis and Economic Effects of the Banking Crisis in Turkey

Authors: Sevilay Konya, Sadife Güngör, Zeynep Karaçor

Abstract:

The Turkish economy has experienced banking crises of different magnitudes from time to time, depending on different factors. Foremost among the factors that hinder the development of countries and societies are crises in the country's economy. Countries' economic growth rates affect inflation, unemployment and external trade. In this study, the effects of the November 2000, February 2001 and 2008 banking crises on Turkey's economy are examined conceptually. In this context, the study investigates Turkey's GDP, inflation, unemployment and foreign trade figures. The ways in which Turkey's economy was affected by the November 2000, February 2001 and 2008 banking crises are thereby identified.

Keywords: banking crises, Turkey’s economy, economic effects, Turkey

Procedia PDF Downloads 292
16013 Cooperative Coevolution for Neuro-Evolution of Feed Forward Networks for Time Series Prediction Using Hidden Neuron Connections

Authors: Ravneil Nand

Abstract:

Cooperative coevolution uses problem decomposition methods to solve a larger problem, breaking it down into a number of smaller sub-problems. Different problem decomposition methods have their own strengths and limitations depending on the neural network used and the application problem. In this paper we introduce a new problem decomposition method known as Hidden-Neuron Level decomposition (HNL). The HNL method is compared with established problem decomposition methods in time series prediction. The results show that the proposed approach improves the results on some benchmark data sets when compared to the standalone method and gives competitive results when compared to methods from the literature.
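A minimal sketch of how a feedforward network's weights can be split into one subcomponent per hidden neuron for cooperative coevolution; the grouping shown follows the generic neuron-level idea, and the paper's HNL variant, which also uses hidden-neuron connections, may group weights differently:

```python
import numpy as np

def decompose_by_hidden_neuron(n_in, n_hidden, n_out):
    """One subcomponent (subpopulation genotype) per hidden neuron:
    its incoming weights, its bias, and its outgoing weights."""
    size = n_in + 1 + n_out
    return [np.zeros(size) for _ in range(n_hidden)]

def assemble(subcomponents, n_in, n_hidden, n_out):
    """Rebuild the full weight matrices from the per-neuron subcomponents."""
    W1 = np.zeros((n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = np.zeros((n_hidden, n_out))
    for h, sub in enumerate(subcomponents):
        W1[:, h] = sub[:n_in]          # incoming weights of hidden neuron h
        b1[h] = sub[n_in]              # bias of hidden neuron h
        W2[h, :] = sub[n_in + 1:]      # outgoing weights of hidden neuron h
    return W1, b1, W2

subs = decompose_by_hidden_neuron(n_in=4, n_hidden=6, n_out=1)
print(len(subs), "subpopulations, each encoding", subs[0].size, "weights")
W1, b1, W2 = assemble(subs, 4, 6, 1)
print("network shapes:", W1.shape, b1.shape, W2.shape)
```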

Keywords: cooperative coevolution, feed forward network, problem decomposition, neuron, synapse

Procedia PDF Downloads 325
16012 On a Theoretical Framework for Language Learning Apps Evaluation

Authors: Juan Manuel Real-Espinosa

Abstract:

This paper addresses the first step to evaluate language learning apps: what theoretical framework to adopt when designing the app evaluation framework. The answer is not just one since there are several options that could be proposed. However, the question to be clarified is to what extent the learning design of apps is based on a specific learning approach, or on the contrary, on a fusion of elements from several theoretical proposals and paradigms, such as m-learning, mobile assisted language learning, and a number of theories about language acquisition. The present study suggests that the reality is closer to the second assumption. This implies that the theoretical framework against which the learning design of the apps should be evaluated must also be a hybrid theoretical framework, which integrates evaluation criteria from the different theories involved in language learning through mobile applications.

Keywords: mobile-assisted language learning, action-oriented approach, apps evaluation, post-method pedagogy, second language acquisition

Procedia PDF Downloads 200
16011 Climate Adaptability of Vernacular Courtyards in Jiangnan Area, Southeast China

Authors: Yu Bingqing

Abstract:

Based on meteorological observation data from conventional meteorological stations in the Jiangnan area from 2001 to 2020 and on a digital elevation model (DEM), the "golden section" comfort index calculation method was used to refine the spatial estimation of climate comfort in the Jiangnan area under undulating terrain on a GIS platform, and its spatiotemporal distribution characteristics in the region were analyzed. The results can provide a reference for the development and utilization of climate resources in the Jiangnan area. The results show that: ① there is a significant spatial difference in winter and summer climate comfort from low latitude to high latitude; ② climate comfort decreases significantly from low altitude to high altitude in winter, while the opposite is true in summer; ③ climate comfort decreases from offshore to inland in winter, but the difference is not significant in summer; ④ the climate comfort level in the natural lake areas is higher in summer than in the surrounding areas, but not in winter; ⑤ in both winter and summer, altitude has the greatest influence on the difference in comfort level.

Keywords: vernacular courtyards, thermal environment, depth-to-height ratio, climate adaptability, Southeast China

Procedia PDF Downloads 55
16010 Optimizing Approach for Sifting Process to Solve a Common Type of Empirical Mode Decomposition Mode Mixing

Authors: Saad Al-Baddai, Karema Al-Subari, Elmar Lang, Bernd Ludwig

Abstract:

Empirical mode decomposition (EMD), a new data-driven method of time-series decomposition, has the advantage of not supposing that a time series is linear or stationary, as is implicitly assumed in Fourier decomposition. However, EMD suffers from the mode mixing problem in some cases. The aim of this paper is to present a solution for a common type of signal that causes the EMD mode mixing problem, namely a signal containing an intermittency. On an artificial example, the solution shows superior performance in coping with the EMD mode mixing problem compared with conventional EMD and Ensemble Empirical Mode Decomposition (EEMD). Furthermore, the over-sifting problem is completely avoided, and the computational load is reduced roughly six times compared with EEMD with an ensemble number of 50.
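A minimal sketch of reproducing the intermittency-driven mode-mixing setup with the PyEMD package (assuming PyEMD is installed, e.g. via `pip install EMD-signal`); the intermittent test signal is an illustrative construction, not the paper's exact example:

```python
import numpy as np
from PyEMD import EMD, EEMD

# Intermittent test signal: a low-frequency tone plus a high-frequency burst
# present only in a short interval -- the classic trigger of mode mixing.
t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 5 * t)
burst = np.where((t > 0.4) & (t < 0.5), np.sin(2 * np.pi * 60 * t), 0.0)
signal = signal + 0.3 * burst

imfs_emd = EMD()(signal)        # plain EMD: the burst and the tone can mix
eemd = EEMD(trials=50)          # ensemble EMD with 50 noise realizations
imfs_eemd = eemd(signal)

print("EMD produced", imfs_emd.shape[0], "IMFs")
print("EEMD produced", imfs_eemd.shape[0], "IMFs")
```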

Keywords: empirical mode decomposition (EMD), mode mixing, sifting process, over-sifting

Procedia PDF Downloads 389
16009 Construction of Graph Signal Modulations via Graph Fourier Transform and Its Applications

Authors: Xianwei Zheng, Yuan Yan Tang

Abstract:

The classical windowed Fourier transform has been widely used in signal processing, image processing, machine learning and pattern recognition. The related Gabor transform is powerful enough to capture the texture information of any given dataset. Recently, in the emerging field of graph signal processing, researchers have been devoting themselves to developing a theory to handle so-called graph signals. Within this developing theory, the windowed graph Fourier transform has been constructed to establish a time-frequency analysis framework for graph signals. The windowed graph Fourier transform is defined using the translation and modulation operators of graph signals, following calculations similar to those of the classical windowed Fourier transform. The translation and modulation operators of graph signals are defined using the Laplacian eigenvectors as follows. For a given graph signal, its translation is defined in a manner similar to its definition in classical signal processing: the classical translation operator can be defined using the Fourier atoms, and graph signal translation is defined analogously using the Laplacian eigenvectors. The modulation on a graph can also be established using the Laplacian eigenvectors. The windowed graph Fourier transform based on these two operators has been applied to obtain time-frequency representations of graph signals. Fundamentally, this modulation operator is defined, analogously to classical modulation, by multiplying a graph signal with the entries of each Fourier atom. However, a single Laplacian eigenvector entry cannot play the same role as a Fourier atom, and this definition ignores the relationship between the translation and modulation operators. In this paper, a new definition of the modulation operator is proposed, and thus another time-frequency framework for graph signals is constructed. Specifically, the relationship between the translation and modulation operations can be established through the Fourier transform: for any signal, the Fourier transform of its translation is the modulation of its Fourier transform. Thus, the modulation of any signal can be defined as the inverse Fourier transform of the translation of its Fourier transform. Analogously, the graph modulation of any graph signal can be defined as the inverse graph Fourier transform of the translation of its graph Fourier transform. This novel definition of the graph modulation operator establishes a relationship between the translation and modulation operations. The new modulation operation and the original translation operation are used to construct a new framework for graph signal time-frequency analysis. Furthermore, a windowed graph Fourier frame theory is developed. Necessary and sufficient conditions for constructing windowed graph Fourier frames, tight frames and dual frames are presented in this paper. The novel graph signal time-frequency analysis framework is applied to signals defined on well-known graphs, e.g., the Minnesota road graph and random graphs. Experimental results show that the novel framework captures new features of graph signals.
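One way to write the relationship described above in formulas; this is a hedged reconstruction from the abstract, assuming $U$ is the orthonormal matrix of Laplacian eigenvectors, $\hat f = U^{\top} f$ the graph Fourier transform, and $T_k$ the graph translation operator:

```latex
% Classical identity motivating the new definition: the Fourier transform of a
% translation is a modulation of the Fourier transform, hence
\widehat{T_a f}(\xi) = e^{-2\pi i a \xi}\,\hat{f}(\xi)
\quad\Longleftrightarrow\quad
M_a f = \mathcal{F}^{-1}\!\bigl(T_a \hat{f}\bigr).

% Proposed graph analogue: the graph modulation of a signal f is the inverse
% graph Fourier transform of the translation of its graph Fourier transform,
\hat{f} = U^{\top} f,
\qquad
M_k f \;:=\; U\,\bigl(T_k \hat{f}\bigr),
% so that translation and modulation stay linked exactly as in the classical case.
```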

Keywords: graph signals, windowed graph Fourier transform, windowed graph Fourier frames, vertex frequency analysis

Procedia PDF Downloads 336
16008 Scenarios of Societal Security and Business Continuity Cycles

Authors: Jiří F. Urbánek, Jiří Barta

Abstract:

This article deals with societal security, continuity scenarios, and a methodological cycling approach. Societal security poses organizational challenges that call for the implementation of the international standard BS 25999-2 and the global ISO 22300 family of standards for business continuity management systems. An efficient global organization system is distinguished by a high degree of complexity, connectivity, and interoperability of its entities, and in fact does not have only cooperative relations. Competing businesses have numerous participating 'enemies' that play apparent or hidden opponent and antagonistic roles against prosperous organization systems, resulting in a crisis scene or even a battle theater. Organizational business continuity scenarios are necessary for the preparedness, planning, management, and mastering of such 'a play' in real environments.

Keywords: business continuity, societal security, crisis scenarios cycles, interoperability

Procedia PDF Downloads 381
16007 Study of Biofuel Produced by Babassu Oil Fatty Acids Esterification

Authors: F. A. F. da Ponte, J. Q. Malveira, I. A. Maciel, M. C. G. Albuquerque

Abstract:

In this work, aviation biofuel production by esterification of fatty acids (C6 to C16) was studied. The process variables in heterogeneous catalysis were evaluated using an experimental design. Temperature and reaction time were the studied parameters, and the methyl ester content was the response of the experimental design. An ion exchange resin was used as a heterogeneous catalyst. The process optimization was carried out using response surface methodology (RSM) and a second-order polynomial model. Results show that the most influential variables on the linear coefficient of each effect studied were temperature and reaction time. The best methyl ester conversion in the experimental design was obtained under the following conditions: 10% wt of catalyst, 100 °C and 4 hours of reaction. The best conversion achieved was 96.5% wt of biofuel.
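The second-order polynomial model underlying response surface methodology can be written in its standard form as below, with $y$ the methyl ester content and $x_1, x_2$ the coded temperature and reaction time (the two factors studied here):

```latex
y \;=\; \beta_0
\;+\; \sum_{i=1}^{2}\beta_i x_i
\;+\; \sum_{i=1}^{2}\beta_{ii} x_i^{2}
\;+\; \sum_{i<j}\beta_{ij} x_i x_j
\;+\; \varepsilon
```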

Keywords: esterification, ion-exchange resins, response surface methodology, biofuel

Procedia PDF Downloads 489
16006 Research on the Evaluation and Delineation of Value Units of New Industrial Parks Based on Implementation-Orientation

Authors: Chengfang Wang, Zichao Wu, Jianying Zhou

Abstract:

At present, much attention is paid to the development of new industrial parks in the era of inventory planning. Generally speaking, there are two types of development models: the incremental development model and the stock development model. The former relies on key projects to build a value innovation park, and the latter relies on the iterative updating of the park to build a value innovation park. Taking the Baiyun Western Digital Park as an example and considering the growth model of value units, the evaluation target is determined. Based on a GIS platform, factors including comprehensive land-use status, regulatory detailed planning, land use planning, the blue-green ecological base, the rail transit system, the road network system, industrial park distribution, and public service facilities are superimposed in a multi-factor comprehensive evaluation of land use within the planning area, constructing a value unit evaluation system, and value units are delineated based on implementation orientation by combining the two development models. The research hopes to provide a reference for the planning and construction of new domestic industrial parks.
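A minimal sketch of the multi-factor superimposed evaluation on raster layers, assuming each factor is normalised and combined by a weighted sum; the factor names and weights are illustrative placeholders, not the study's scoring scheme:

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Normalise each factor raster to [0, 1] and combine by weighted sum."""
    total = np.zeros_like(next(iter(layers.values())), dtype=float)
    for name, w in weights.items():
        layer = layers[name].astype(float)
        norm = (layer - layer.min()) / (layer.max() - layer.min() + 1e-9)
        total += w * norm
    return total

# Illustrative rasters (e.g. resampled from GIS layers onto a common grid)
rng = np.random.default_rng(0)
shape = (100, 100)
layers = {
    "rail_access":      rng.random(shape),
    "road_density":     rng.random(shape),
    "industry_density": rng.random(shape),
    "public_services":  rng.random(shape),
    "blue_green_base":  rng.random(shape),
}
weights = {"rail_access": 0.25, "road_density": 0.2, "industry_density": 0.25,
           "public_services": 0.2, "blue_green_base": 0.1}

score = weighted_overlay(layers, weights)
# Candidate value units: cells in the top decile of the composite score
value_units = score >= np.quantile(score, 0.9)
print("cells flagged as candidate value units:", int(value_units.sum()))
```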

Keywords: value units, GIS, multi-factor evaluation, implementation orientation

Procedia PDF Downloads 183
16005 The Effect of a Multidisciplinary Spine Clinic on Treatment Rates and Lead Times to Care

Authors: Ishan Naidu, Jessica Ryvlin, Devin Videlefsky

Abstract:

Introduction: Back pain is a leading cause of years lived with disability and of economic burden, exceeding $20 billion in healthcare costs, not including indirect costs such as absence from work and caregiving. The multifactorial nature of back pain leads to treatment modalities administered by a variety of specialists, which are often disjointed. Multiple studies have found that patients receiving delayed physical therapy for lower back pain had higher medical-related costs from increased health service utilization, as well as less improvement in pain severity, compared to early management. Uncoordinated health care delivery can exacerbate the physical and economic toll of this chronic condition, so improvements in interdisciplinary, shared decision-making may improve outcomes. Objective: To assess whether a multidisciplinary spine clinic (MSC), consisting of orthopedic surgery, neurosurgery, pain medicine, and physiatry, alters interventional and non-interventional planning and treatment compared to a traditional unidisciplinary spine clinic (USC) including only orthopedic surgery. Methods: We conducted a retrospective cohort study of patients initially presenting for spine care to orthopedic surgeons between July 1, 2018 and June 30, 2019. Time to treatment recommendation, time to treatment and rates of treatment recommendations were assessed, including physical therapy, injections and surgery. Treatment rates were compared between MSC and USC using Pearson's chi-square test and logistic regression. Time to treatment recommendation and time to treatment were compared using the log-rank test and Cox proportional hazards regression. All analyses were repeated for the propensity score (PS) matched subsample. Results: This study included 1,764 patients, with 692 at MSC and 1,072 at USC. Patients in MSC were more likely to be recommended injection when compared to USC (8.5% vs. 5.4%, p=0.01). When adjusted for confounders, the likelihood of an injection recommendation remained greater in MSC than USC (odds ratio [OR]=2.22, 95% CI: (1.39, 3.53), p=0.001). MSC was also associated with a shorter time to receiving an injection recommendation versus USC (median: 21 vs. 32 days, log-rank: p<0.001; hazard ratio [HR]=1.90, 95% CI: (1.25, 2.90), p=0.003). MSC was associated with a higher likelihood of injection treatment (OR=2.27, 95% CI: (1.39, 3.73), p=0.001) and a shorter lead time (HR=1.98, 95% CI: (1.27, 3.09), p=0.003). PS-matched analyses yielded similar conclusions. Conclusions: Care delivered at a multidisciplinary spine clinic was associated with a higher likelihood of recommending injection and a shorter lead time to injection administration when compared to a traditional unidisciplinary spine surgery clinic. Multidisciplinary clinics may facilitate coordinated care among different specialties, resulting in increased utilization of less invasive treatment modalities while also improving care efficiency. The multidisciplinary clinic model is an important advancement in care delivery and communication, which can be used as a powerful method of improving patient outcomes as treatment guidelines evolve.

Keywords: coordinated care, epidural steroid injection, multi-disciplinary, non-invasive

Procedia PDF Downloads 137