Search results for: minimum data set
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26553

22803 Experimental Studies and CFD Predictions on Hydrodynamics of Gas-Solid Flow in an ICFB with a Draft Tube

Authors: Ravi Gujjula, Chinna Eranna, Narasimha Mangadoddy

Abstract:

A hydrodynamic study of gas-solid flow in an internally circulating fluidized bed (ICFB) with a draft tube is presented in this paper, using a high-speed camera and pressure probes on a laboratory ICFB test rig (a 3.0 m x 2.7 m column with a draft tube located at the center). Experiments were conducted using sand particles of different sizes with varying particle size distributions. In each experimental run, the standard pressure-flow curves for both the draft tube and the annular bed region were measured, and the downward particle velocity in the annular bed region was measured at the same time. The effects of superficial gas velocity, static bed height (40, 50, and 60 cm), and draft tube gap height (10.5 and 14.5 cm) on the pressure drop profiles, solids circulation pattern, and gas bypassing dynamics of the ICFB were investigated extensively. The mechanisms governing solids recirculation and pressure losses in an ICFB are elucidated on the basis of the gas and solid dynamics obtained from the experimental data. 3D CFD simulation runs of the ICFB were also conducted, and the extracted data were validated against the ICFB experimental data.

Keywords: ICFB, CFD, pressure drop, solids recirculation, bed height, draft tube

Procedia PDF Downloads 516
22802 Domain Adaptive Dense Retrieval with Query Generation

Authors: Rui Yin, Haojie Wang, Xun Li

Abstract:

Recently, mainstream dense retrieval methods have obtained state-of-the-art results on some datasets and tasks. However, they require large amounts of training data, which is not available in most domains. The severe performance degradation of dense retrievers on new data domains has limited the use of dense retrieval methods to only a few domains with large training datasets. In this paper, we propose an unsupervised domain-adaptive approach based on query generation. First, a generative model is used to generate relevant queries for each passage in the target corpus; then, the generated queries are used for mining negative passages. Finally, the query-passage pairs are labeled with a cross-encoder and used to train a domain-adapted dense retriever. We also explore contrastive learning as a method for training domain-adapted dense retrievers and show that it leads to strong performance in various retrieval settings. Experiments show that our approach is more robust than previous methods in target domains while requiring less unlabeled data.
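
In-batch contrastive training of the kind explored here scores each query against its paired positive passage, with the other passages in the batch acting as negatives. Below is a minimal NumPy sketch of that InfoNCE-style objective; the temperature value, embedding size, and toy vectors are illustrative assumptions, not details from the paper.

```python
import numpy as np

def info_nce_loss(q, p, temperature=0.05):
    """InfoNCE loss with in-batch negatives: q[i] should match p[i];
    every other passage in the batch serves as a negative."""
    q = q / np.linalg.norm(q, axis=1, keepdims=True)   # L2-normalize embeddings
    p = p / np.linalg.norm(p, axis=1, keepdims=True)
    sim = q @ p.T / temperature                        # (B, B) similarity matrix
    sim = sim - sim.max(axis=1, keepdims=True)         # stabilize the softmax
    log_softmax = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))              # NLL of the diagonal (positives)

rng = np.random.default_rng(0)
queries = rng.normal(size=(4, 8))
positives = queries + 0.01 * rng.normal(size=(4, 8))   # near-duplicates -> low loss
print(info_nce_loss(queries, positives))
```

In a real dense retriever the embeddings would come from the query and passage encoders, and the loss would be minimized by gradient descent rather than merely evaluated.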

Keywords: dense retrieval, query generation, contrastive learning, unsupervised training

Procedia PDF Downloads 104
22801 Design of a Graphical User Interface for Data Preprocessing and Image Segmentation Process in 2D MRI Images

Authors: Enver Kucukkulahli, Pakize Erdogmus, Kemal Polat

Abstract:

The 2D image segmentation is a significant process for finding a suitable region in medical images such as MRI, PET, CT, etc. In this study, we have focused on 2D MRI images for the image segmentation process. We have designed a GUI (graphical user interface) written in MATLAB for 2D MRI images. In this program, there are two different interfaces covering data pre-processing and image clustering or segmentation. The data pre-processing section offers a median filter, average filter, unsharp mask filter, Wiener filter, and a custom filter (a filter designed by the user in MATLAB). As for image clustering, there are seven different image segmentation algorithms for 2D MR images: PSO (particle swarm optimization), GA (genetic algorithm), Lloyd's algorithm, k-means, the combination of Lloyd's algorithm and k-means, mean shift clustering, and finally BBO (biogeography-based optimization). To find a suitable cluster number for a 2D MRI image, we have designed a histogram-based cluster estimation method and then passed the estimated numbers to the image segmentation algorithms to cluster an image automatically. The best hybrid method for each 2D MR image can also be selected through this GUI software.
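
The histogram-based cluster estimation idea can be illustrated by counting local peaks in the grey-level histogram and handing the estimate to a clustering routine. The sketch below is an illustrative stand-in (the peak criterion, bin count, and plain 1D k-means are assumptions, not the paper's MATLAB implementation); a practical version would smooth the histogram before peak counting.

```python
import numpy as np

def estimate_k_from_histogram(image, bins=64, min_prominence=0.01):
    """Estimate the number of clusters as the number of local peaks in the
    grey-level histogram (a simple stand-in for the paper's method; real use
    would smooth the histogram first to suppress noise peaks)."""
    hist, _ = np.histogram(image.ravel(), bins=bins, density=True)
    peaks = [i for i in range(1, bins - 1)
             if hist[i] > hist[i - 1] and hist[i] >= hist[i + 1]
             and hist[i] > min_prominence]
    return max(len(peaks), 2)   # at least foreground/background

def kmeans_1d(values, k, iters=20, seed=0):
    """Plain k-means on grey levels: returns per-pixel labels and centers."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

# Synthetic "MRI slice" intensities: two populations plus noise
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(60, 5, 500), rng.normal(180, 5, 500)])
k = estimate_k_from_histogram(img)
labels, centers = kmeans_1d(img, k)
print(k, sorted(centers.round(1)))
```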

Keywords: image segmentation, clustering, GUI, 2D MRI

Procedia PDF Downloads 377
22800 Reactive Analysis of Different Protocols in Mobile Ad Hoc Networks

Authors: Manoj Kumar

Abstract:

Routing protocols have a central role in any mobile ad hoc network (MANET). There are many routing protocols that exhibit different performance levels in different scenarios. In this paper, we compare the AODV, DSDV, DSR, and ZRP routing protocols in mobile ad hoc networks to determine the best operational conditions for each protocol. We analyze these routing protocols through extensive simulations in the OPNET simulator and show how pause time and the number of nodes affect their performance. In this study, performance is measured in terms of control traffic received, control traffic sent, data traffic received, data traffic sent, throughput, and retransmission attempts.

Keywords: AODV, DSDV, DSR, ZRP

Procedia PDF Downloads 518
22799 Establishment of Landslide Warning System Using Surface or Sub-Surface Sensors Data

Authors: Neetu Tyagi, Sumit Sharma

Abstract:

The study presents the results of an integrated investigation of the Tangni landslide, located on NH-58 at Chamoli, Uttarakhand. Geological, geomorphological, and geotechnical investigations were carried out to understand the mechanism of the landslide and to plan further investigation and monitoring. The movements were favored by continuous infiltration of rainfall water from the zones where the phyllites/slates and dolomites outcrop. The site investigations, including the monitoring of landslide movements and of water level fluctuations due to rainfall, give us a better understanding of the landslide dynamics that have been causing soil instability at the Tangni landslide site over time. For the early warning system (EWS), different types of sensors were installed; all sensors were connected directly to a data logger, and the raw data were transferred to the Defence Terrain Research Laboratory (DTRL) server room via the File Transfer Protocol (FTP). The slip surfaces were found at depths ranging from 8 to 10 m by geophysical survey, and sensors were therefore installed to a depth of 15 m at various locations on the landslide. Rainfall is the main triggering factor of the landslide, so a model of unsaturated soil slope stability was developed in this study. Analysis of the sensor data available for one year indicated a sliding surface at depths between 6 and 12 m, with a total displacement of up to 6 cm per year recorded in the body of the landslide. The aim of this study is to set thresholds and generate early warnings; local people are already alert to the landslide and would benefit from any such warning system.
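
The thresholding step of such an early warning system can be sketched as a mapping from cumulative sensor displacement to an alert level. The threshold values and readings below are hypothetical placeholders; actual thresholds would be calibrated from the Tangni sensor record.

```python
import numpy as np

# Hypothetical alert thresholds (mm of cumulative displacement), in ascending
# order -- real values would be calibrated from the site's sensor history.
THRESHOLDS = {"watch": 10.0, "warning": 30.0, "evacuate": 50.0}

def warning_level(displacement_mm):
    """Map cumulative displacement from a sub-surface sensor to an alert level;
    the highest threshold reached wins."""
    level = "normal"
    for name, limit in THRESHOLDS.items():
        if displacement_mm >= limit:
            level = name
    return level

readings = np.cumsum([0.5, 1.2, 4.0, 9.0, 20.0])   # mm per observation window
alerts = [warning_level(r) for r in readings]
print(alerts)
```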

Keywords: early warning system, file transfer protocol, geo-morphological, geotechnical, landslide

Procedia PDF Downloads 158
22798 Assessment of Heavy Metals in Irrigation Water Collected from Various Vegetables Growing Areas of Swat Valley

Authors: Islam Zeb

Abstract:

Poor-quality water used for irrigation has the potential to be a direct source of contamination and a vehicle for spreading contamination in the field. A number of wide-ranging review articles have highlighted irrigation water as a source of heavy metal toxicity, which leads to chronic diseases in the human body. A study was therefore planned to determine the microbial and heavy metal status of irrigation water collected from various locations of district Swat in various months. The analyses were carried out at the Environmental Horticulture Laboratory, Department of Horticulture, The University of Agriculture Peshawar, during the year 2018-19. The experiment was laid out in a Randomized Complete Block Design (RCBD) with two factors and three replicates: factor A consisted of the different locations and factor B represented the various months. Regarding heavy metal concentrations in the different regions, the maximum lead, cadmium, chromium, nickel, and copper levels (4.27, 0.56, 0.81, 1.33, and 1.51 mg L-1, respectively) were noted for the irrigation water samples collected from Mingora, while the minimum concentrations (2.59, 0.30, 0.27, 0.40, and 0.54 mg L-1, respectively) were noted for the samples from Matta. As for the heavy metal contents across months, the maximum lead, cadmium, chromium, nickel, and copper contents (4.56, 0.63, 1.15, 1.31, and 1.48 mg L-1, respectively) were noted for the samples collected in January/February, while the lowest values (2.38, 0.24, 0.21, 0.41, and 0.52 mg L-1, respectively) were noted for the samples of July/August. A significant interaction was found for all the studied parameters. It was concluded that the heavy metal concentration was highest in the irrigation water samples collected from the Mingora location during January/February, because Mingora is the most polluted of the studied regions and because in winter much of the water freezes, so mostly contaminated water is used for irrigation.

Keywords: irrigation water, various months, different regions, heavy metals contamination, Swat

Procedia PDF Downloads 78
22797 Microchip-Integrated Computational Models for Studying Gait and Motor Control Deficits in Autism

Authors: Noah Odion, Honest Jimu, Blessing Atinuke Afuape

Abstract:

Introduction: Motor control and gait abnormalities are commonly observed in individuals with autism spectrum disorder (ASD), affecting their mobility and coordination. Understanding the underlying neurological and biomechanical factors is essential for designing effective interventions. This study focuses on developing microchip-integrated wearable devices to capture real-time movement data from individuals with autism. By applying computational models to the collected data, we aim to analyze motor control patterns and gait abnormalities, bridging a crucial knowledge gap in autism-related motor dysfunction. Methods: We designed microchip-enabled wearable devices capable of capturing precise kinematic data, including joint angles, acceleration, and velocity during movement. A cross-sectional study was conducted on individuals with ASD and a control group to collect comparative data. Computational modelling was applied using machine learning algorithms to analyse motor control patterns, focusing on gait variability, balance, and coordination. Finite element models were also used to simulate muscle and joint dynamics. The study employed descriptive and analytical methods to interpret the motor data. Results: The wearable devices effectively captured detailed movement data, revealing significant gait variability in the ASD group. For example, gait cycle time was 25% longer, and stride length was reduced by 15% compared to the control group. Motor control analysis showed a 30% reduction in balance stability in individuals with autism. Computational models successfully predicted movement irregularities and helped identify motor control deficits, particularly in the lower limbs. Conclusions: The integration of microchip-based wearable devices with computational models offers a powerful tool for diagnosing and treating motor control deficits in autism. 
These results have significant implications for patient care, providing objective data to guide personalized therapeutic interventions. The findings also contribute to the broader field of neuroscience by improving our understanding of the motor dysfunctions associated with ASD and other neurodevelopmental disorders.
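
Group comparisons of the kind reported above reduce to a handful of summary gait metrics. The sketch below computes mean gait cycle time, mean stride length, and stride-time variability from hypothetical wearable-derived samples; the numbers are illustrative, not study data.

```python
import numpy as np

def gait_metrics(stride_times_s, stride_lengths_m):
    """Summary metrics of the kind compared between groups: mean gait cycle
    time, mean stride length, and stride-time variability (coefficient of
    variation). Inputs are hypothetical wearable-derived values."""
    st = np.asarray(stride_times_s)
    sl = np.asarray(stride_lengths_m)
    return {
        "cycle_time_s": st.mean(),
        "stride_length_m": sl.mean(),
        "stride_time_cv_pct": 100 * st.std() / st.mean(),
    }

# Made-up samples: the ASD group shows longer, more variable, shorter strides
control = gait_metrics([1.00, 1.02, 0.98, 1.01], [1.40, 1.42, 1.38, 1.40])
asd = gait_metrics([1.25, 1.35, 1.15, 1.30], [1.20, 1.18, 1.22, 1.16])
print(control["cycle_time_s"], asd["cycle_time_s"])
```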

Keywords: motor control, gait abnormalities, autism, wearable devices, microchips, computational modeling, kinematic analysis, neurodevelopmental disorders

Procedia PDF Downloads 24
22796 Radio Frequency Identification Device Based Emergency Department Critical Care Billing: A Framework for Actionable Intelligence

Authors: Shivaram P. Arunachalam, Mustafa Y. Sir, Andy Boggust, David M. Nestler, Thomas R. Hellmich, Kalyan S. Pasupathy

Abstract:

Emergency departments (EDs) provide urgent care to patients throughout the day in a complex and chaotic environment. Real-time location systems (RTLS) are increasingly being utilized in healthcare settings and have been shown to improve safety, reduce cost, and increase patient satisfaction. Radio Frequency Identification Device (RFID) data in an ED have been used to compute variables such as patient-provider contact time, which is associated with patient outcomes such as 30-day hospitalization. These variables can provide avenues for improving ED operational efficiency. A major challenge in ED financial operations is under-coding of critical care services, due to physicians' difficulty reporting accurate times for critical care provided under Current Procedural Terminology (CPT) codes 99291 and 99292. In this work, the authors propose a framework to optimize ED critical care billing using RFID data. RFID-estimated physician-patient contact times could accurately quantify direct critical care services, which will help model a data-driven approach to ED critical care billing. This paper describes the framework and provides insights into opportunities to prevent under-coding as well as over-coding, thereby avoiding insurance audits. Future work will focus on data analytics to demonstrate the feasibility of the framework described.
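
The mapping from aggregated contact time to CPT codes can be sketched directly. The sketch below applies the commonly published time thresholds for 99291 (first 30-74 minutes) and 99292 (each additional 30 minutes); the contact intervals are hypothetical RFID-derived values, and any real implementation must be verified against current payer rules.

```python
from math import ceil

def critical_care_codes(total_minutes):
    """Suggest CPT codes from aggregated physician-patient contact minutes.
    Under 30 minutes, critical care is not separately reportable with 99291.
    (Thresholds follow commonly published CPT guidance; verify against
    current payer rules before any real use.)"""
    if total_minutes < 30:
        return []
    codes = ["99291"]                        # first 30-74 minutes
    if total_minutes >= 75:
        codes += ["99292"] * ceil((total_minutes - 74) / 30)
    return codes

# Hypothetical RFID-derived contact intervals (start, end) in minutes
intervals = [(0, 20), (45, 70), (90, 125)]
total = sum(end - start for start, end in intervals)
print(total, critical_care_codes(total))
```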

Keywords: critical care billing, CPT codes, emergency department, RFID

Procedia PDF Downloads 131
22795 Estimation of Service Quality and Its Impact on Market Share Using Business Analytics

Authors: Haritha Saranga

Abstract:

Service quality has become an important driver of competition in manufacturing industries of late, as many products are being sold in conjunction with service offerings. With increase in computational power and data capture capabilities, it has become possible to analyze and estimate various aspects of service quality at the granular level and determine their impact on business performance. In the current study context, dealer level, model-wise warranty data from one of the top two-wheeler manufacturers in India is used to estimate service quality of individual dealers and its impact on warranty related costs and sales performance. We collected primary data on warranty costs, number of complaints, monthly sales, type of quality upgrades, etc. from the two-wheeler automaker. In addition, we gathered secondary data on various regions in India, such as petrol and diesel prices, geographic and climatic conditions of various regions where the dealers are located, to control for customer usage patterns. We analyze this primary and secondary data with the help of a variety of analytics tools such as Auto-Regressive Integrated Moving Average (ARIMA), Seasonal ARIMA and ARIMAX. Study results, after controlling for a variety of factors, such as size, age, region of the dealership, and customer usage pattern, show that service quality does influence sales of the products in a significant manner. A more nuanced analysis reveals the dynamics between product quality and service quality, and how their interaction affects sales performance in the Indian two-wheeler industry context. We also provide various managerial insights using descriptive analytics and build a model that can provide sales projections using a variety of forecasting techniques.
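
The ARIMA family used in the study reduces, in its simplest form, to an autoregression fitted by least squares. As a hedged illustration (not the authors' model or data), the sketch below fits an AR(1) process to simulated data and forecasts from it.

```python
import numpy as np

def fit_ar1(y):
    """Least-squares fit of an AR(1) model y[t] = c + phi * y[t-1] + e[t],
    the simplest member of the ARIMA family used in studies like this one."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    (c, phi), *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return c, phi

def forecast_ar1(c, phi, last, steps):
    """Iterate the fitted recurrence forward to produce point forecasts."""
    out = []
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Simulate y[t] = 2 + 0.8 * y[t-1] + noise, then recover the coefficients
rng = np.random.default_rng(0)
y = [10.0]
for _ in range(300):
    y.append(2.0 + 0.8 * y[-1] + rng.normal(scale=0.5))
y = np.array(y)
c, phi = fit_ar1(y)
print(round(c, 2), round(phi, 2), forecast_ar1(c, phi, y[-1], 3))
```

A seasonal ARIMA or ARIMAX model adds differencing, seasonal lags, and exogenous regressors on top of exactly this kind of autoregressive core.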

Keywords: service quality, product quality, automobile industry, business analytics, auto-regressive integrated moving average

Procedia PDF Downloads 120
22794 Automatic and High Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and predict the behavior of a system, mathematical models are formulated, and parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to compute efficiently. Such models may therefore not be applicable for the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general, since it generates models for any system, detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables, not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and the explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. Herewith, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has been tested successfully in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with a precision error of less than one percent, and the automatic identification of correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise, real-time-adaptive, data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm such as WORHP (We Optimize Really Huge Problems), complex systems can now be represented by a high-precision model and optimized according to the user's wishes. The proposed methods are illustrated with different examples.
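
The idea of regressing on products of variables, motivated by the series expansion, can be sketched as ordinary least squares over a feature set extended with pairwise products. The sketch below is an illustrative stand-in for the described method, not the authors' tool; the data and the hidden interaction coefficient are made up.

```python
import numpy as np

def fit_with_interactions(X, y):
    """Multivariate least-squares regression whose feature set also includes
    pairwise products of the inputs, in the spirit of a truncated series
    expansion. Returns a name -> coefficient mapping."""
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    names = ["1"] + [f"x{i}" for i in range(d)]
    for i in range(d):
        for j in range(i, d):
            cols.append(X[:, i] * X[:, j])      # product terms x_i * x_j
            names.append(f"x{i}*x{j}")
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return dict(zip(names, coef))

# Hidden law with a pure product term: y = 3 + 2 * x0 * x1 + small noise.
# A regression on single variables alone would miss it entirely.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = 3.0 + 2.0 * X[:, 0] * X[:, 1] + 0.01 * rng.normal(size=500)
model = fit_with_interactions(X, y)
print(round(model["x0*x1"], 2))
```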

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 409
22793 Relay-Augmented Bottleneck Throughput Maximization for Correlated Data Routing: A Game Theoretic Perspective

Authors: Isra Elfatih Salih Edrees, Mehmet Serdar Ufuk Türeli

Abstract:

In this paper, an energy-aware method is presented, integrating energy-efficient relay-augmented techniques for correlated data routing with the goal of optimizing bottleneck throughput in wireless sensor networks. The system tackles the dual challenge of throughput optimization while considering sensor network energy consumption. A unique routing metric has been developed to enable throughput maximization while minimizing energy consumption by utilizing data correlation patterns. The paper introduces a game theoretic framework to address the NP-complete optimization problem inherent in throughput-maximizing correlation-aware routing with energy limitations. By creating an algorithm that blends energy-aware route selection strategies with best-response dynamics, this framework provides a local solution. The suggested technique considerably raises the bottleneck throughput for each source in the network while reducing energy consumption by choosing routes that strike a balance between throughput enhancement and energy efficiency. Extensive numerical analyses verify the efficiency of the method. The outcomes demonstrate the significant decrease in energy consumption attained by the energy-efficient relay-augmented bottleneck throughput maximization technique, in addition to confirming the anticipated throughput benefits.
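
Best-response dynamics of the kind invoked here can be illustrated on a toy congestion game: each source picks a relay, its cost is the load on that relay, and sources take turns switching to a cheaper relay until no one can improve. This is a deliberately simplified stand-in for the paper's routing game; the relay names and source count are arbitrary.

```python
from collections import Counter

def best_response_dynamics(n_sources=6, relays=("A", "B", "C"), max_rounds=50):
    """Sources repeatedly switch to the least-loaded relay until no single
    source can lower its cost: a pure Nash equilibrium of this congestion
    game, found by best-response iteration."""
    choice = {s: relays[0] for s in range(n_sources)}   # all start on relay A
    for _ in range(max_rounds):
        changed = False
        for s in range(n_sources):
            load = Counter(choice.values())
            current = load[choice[s]]
            # cost of moving to relay r is load[r] + 1 (the source itself)
            best = min(relays,
                       key=lambda r: load[r] + 1 if r != choice[s] else current)
            if best != choice[s] and load[best] + 1 < current:
                choice[s] = best
                changed = True
        if not changed:          # fixed point reached: no profitable deviation
            break
    return Counter(choice.values())

print(best_response_dynamics())
```

Because this is a finite congestion game, the iteration is guaranteed to terminate at an equilibrium, here a perfectly balanced load across the three relays.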

Keywords: correlated data aggregation, energy efficiency, game theory, relay-augmented routing, throughput maximization, wireless sensor networks

Procedia PDF Downloads 82
22792 Exploring Behavioural Biases among Indian Investors: A Qualitative Inquiry

Authors: Satish Kumar, Nisha Goyal

Abstract:

In the stock market, individual investors exhibit different kinds of behaviour. Traditional finance is built on the notion of 'homo economicus', which states that humans always make perfectly rational choices to maximize their wealth and minimize risk; that is, traditional finance is concerned with how investors should behave rather than with how actual investors behave. Behavioural finance provides the explanation for this phenomenon. Although finance has been studied for thousands of years, behavioural finance is an emerging field that combines behavioural or psychological aspects with conventional economic and financial theories to explain how emotions and cognitive factors influence investors' behaviour. These emotions and cognitive factors are known as behavioural biases, and because of them investors make irrational investment decisions. Besides the emotional and cognitive factors, the social influence of the media, as well as of friends, relatives, and colleagues, also affects investment decisions. Psychological factors influence individual investors' investment decision making, but few studies have used qualitative methods to understand these factors. The aim of this study is to explore the behavioural factors or biases that affect individuals' investment decision making. For the purpose of this exploratory study, an in-depth interview method was used because it provides much more exhaustive information and a relaxed atmosphere in which people feel comfortable providing information. Twenty investment advisors with a minimum of 5 years' experience in securities firms were interviewed. In this study, thematic content analysis was used to analyse the interview transcripts; this process involves analysis of transcripts, coding, and identification of themes from the data. Based on the analysis, we categorized the statements of the advisors into various themes.
Past market returns and volatility; preference for safe returns; tendency to believe they are better than others; tendency to divide their money into different accounts/assets; tendency to hold on to loss-making assets; preference to invest in familiar securities; tendency to believe that past events were predictable; tendency to rely on a reference point; tendency to rely on other sources of information; tendency to regret past decisions; tendency to be more sensitive to losses than to gains; tendency to rely on their own skills; and tendency to buy rising stocks with the expectation that the rise will continue were some of the major concerns raised by the experts about investors. The findings of the study revealed 13 biases present in Indian investors: overconfidence bias, disposition effect, familiarity bias, framing effect, anchoring bias, availability bias, self-attribution bias, representativeness, mental accounting, hindsight bias, regret aversion, loss aversion, and herding/media bias. These biases have a negative connotation because they produce a distortion in the calculation of an outcome. They are classified under three categories: cognitive errors, emotional biases, and social interaction. The findings of this study may assist both financial service providers and researchers in understanding the various psychological biases of individual investors in investment decision making. Additionally, individual investors will become aware of the behavioural biases, which will aid them in making sensible and efficient investment decisions.

Keywords: financial advisors, individual investors, investment decisions, psychological biases, qualitative thematic content analysis

Procedia PDF Downloads 169
22791 CSR Reporting, State Ownership, and Corporate Performance in China: Proof from Longitudinal Data of Publicly Traded Enterprises from 2006 to 2020

Authors: Wanda Luen-Wun Siu, Xiaowen Zhang

Abstract:

This paper offers primary systematic evidence on how CSR reporting relates to enterprise earnings in listed firms in China, in light of most prior evidence focusing on cross-sectional data or data spanning a short period of time. Using full economic and business panel data on China's publicly listed enterprises from 2006 to 2020 in the China Stock Market and Accounting Research database, we found initial evidence of significant direct relations between CSR reporting and corporate performance in both state-owned and privately owned firms over this period, supporting the stakeholder theory. Results also revealed that state-owned enterprises performed as well as private enterprises in the current period, but private enterprises performed better than state-owned enterprises in the subsequent years. Moreover, the release of social responsibility reports had a more significant impact on the financial performance of state-owned and private enterprises in the current period than in the subsequent periods. Specifically, CSR release was not significantly associated with the financial performance of state-owned enterprises at the first, second, and third lags, but it did have an impact at those lags among private enterprises. Such findings suggest that CSR reporting helps improve the corporate financial performance of both state-owned and private enterprises in the current period, but that this effect is more significant among private enterprises in the lag periods.

Keywords: China’s listed firms, CSR reporting, financial performance, panel analysis

Procedia PDF Downloads 167
22790 A 0-1 Goal Programming Approach to Optimize the Layout of Hospital Units: A Case Study in an Emergency Department in Seoul

Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee

Abstract:

This paper proposes a method to optimize the layout of an emergency department (ED) based on real executions of care processes, considering several planning objectives simultaneously. Recently, demand for healthcare services has increased dramatically. As the demand for healthcare services increases, so does the need for new healthcare buildings as well as for redesigning and renovating existing ones. The importance of implementing a standard set of engineering facility planning and design techniques has already been proved in both manufacturing and the service industry, with many significant functional efficiencies. However, the high complexity of care processes remains a major challenge to applying these methods in healthcare environments. Process mining techniques were applied in this study to tackle the problem of complexity and to enhance care process analysis. Process-related information, such as clinical pathways, was extracted from the information system of an ED. A 0-1 goal programming approach is then proposed to find a single layout that simultaneously satisfies several goals. The proposed model was solved with the optimization software CPLEX 12. The solution reached using the proposed method yields a 42.2% improvement in the walking distance of normal patients and a 47.6% improvement in the walking distance of critical patients, at a minimum cost of relocation. It was observed that many patients must unnecessarily walk long distances during their visit to the emergency department because of an inefficient design; a carefully designed layout can significantly decrease patient walking distance and related complications.
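
The flavour of the optimization can be conveyed with a tiny brute-force example: assign units to candidate locations so that a weighted sum of trip-count times distance is minimized, weighting critical patients more heavily. This weighted-sum sketch is a simplification of a true 0-1 goal program (no deviation variables, no CPLEX), and all frequencies and distances are made-up illustrative values.

```python
from itertools import permutations

# Hypothetical trip counts between a fixed entrance and three ED units, split
# by patient acuity, plus candidate slots with distances from the entrance (m)
units = ["triage", "imaging", "resuscitation"]
trips_normal = {"triage": 50, "imaging": 20, "resuscitation": 5}
trips_critical = {"triage": 10, "imaging": 5, "resuscitation": 30}
location_dist = [10, 30, 60]

def weighted_distance(assignment, w_normal=1.0, w_critical=2.0):
    """Multi-goal objective collapsed to a weighted sum: total walking
    distance, with critical-patient trips weighted more heavily."""
    cost = 0.0
    for unit, slot in zip(units, assignment):
        d = location_dist[slot]
        cost += w_normal * trips_normal[unit] * d
        cost += w_critical * trips_critical[unit] * d
    return cost

# With only 3 units the 0-1 assignment can be brute-forced; a real ED layout
# needs an integer programming solver such as the CPLEX run in the paper.
best = min(permutations(range(3)), key=weighted_distance)
layout = dict(zip(units, (location_dist[s] for s in best)))
print(layout)
```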

Keywords: healthcare operation management, goal programming, facility layout problem, process mining, clinical processes

Procedia PDF Downloads 295
22789 Opportunities for Precision Feed in Apiculture

Authors: John Michael Russo

Abstract:

Honeybees are important to our food system and continue to suffer from high rates of colony loss. Precision feed has brought many benefits to livestock cultivation, and these should transfer to apiculture. However, apiculture has unique challenges. The objective of this research is to understand how principles of precision agriculture, applied to apiculture and feed specifically, might effectively improve state-of-the-art cultivation. The methodology surveys apicultural practice to build a model for assessment. First, a review of apicultural motivators is made. Feed method is then evaluated. Finally, precision feed methods are examined as accelerants with the potential to advance the effectiveness of feed practice. Six important motivators emerge: colony loss, disease, climate change, site variance, operational costs, and competition. Feed practice itself is used to compensate for environmental variables. The research finds that the current state of the art in apiculture feed focuses on critical challenges in the management of feed schedules that satisfy the requirements of the bees, preserve potency, optimize environmental variables, and manage costs. Many of the challenges are most acute when feed is used to dispense medication. Technologies such as RNA treatments have even more rigorous demands. Precision feed solutions focus on strategies that accommodate the specific needs of individual livestock. A major component is data; such solutions integrate precise data with methods that respond to individual needs. There is enormous opportunity for precision feed to improve apiculture through the integration of precision data with policies that translate data into optimized action in the apiary, particularly through automation.

Keywords: precision agriculture, precision feed, apiculture, honeybees

Procedia PDF Downloads 78
22788 An Assessment of Different Blade Tip Timing (BTT) Algorithms Using an Experimentally Validated Finite Element Model Simulator

Authors: Mohamed Mohamed, Philip Bonello, Peter Russhard

Abstract:

Blade Tip Timing (BTT) is a technology concerned with the estimation of both the frequency and amplitude of rotating blade vibration. A BTT system comprises two main parts: (a) the arrival time measurement system, and (b) the analysis algorithms. Simulators play an important role in the development of the analysis algorithms, since they generate blade tip displacement data from simulated blade vibration under controlled conditions. This enables an assessment of the performance of the different algorithms with respect to their ability to accurately reproduce the original simulated vibration. Such an assessment is usually not possible with real engine data, since there is no practical alternative to BTT for blade vibration measurement. Most simulators used in the literature are based on a simple spring-mass-damper model to determine the vibration. In this work, a more realistic, experimentally validated simulator based on the Finite Element (FE) model of a bladed disc (blisk) is first presented. It is then used to generate the necessary data for the assessment of different BTT algorithms. The FE modelling is validated using both a hammer test and two FireWire cameras for the mode shapes. A number of autoregressive methods, fitting methods, and state-of-the-art inverse methods (i.e. the Russhard method) are compared. All methods are assessed with respect to both synchronous and asynchronous excitations, with both single and simultaneous frequencies. The study assesses the applicability of each method for different conditions of vibration, amounts of sampling data, and testing facilities, according to its performance and efficiency under these conditions.
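
The core BTT relation, that a vibrating tip arrives at the probe early or late in proportion to its deflection, can be sketched in a few lines. The rotor parameters below are assumed for illustration and are not those of the blisk in the paper.

```python
import numpy as np

# Toy rotor parameters (assumed for illustration only)
R = 0.25                               # blade tip radius, m
omega = 3000 * 2 * np.pi / 60          # 3000 rpm in rad/s
n_blades = 8

def tip_displacement(arrival_times, rev_start_times, blade_idx):
    """Core BTT relation: a vibrating tip arrives early or late at the probe,
    and the deflection is d = R * omega * (t_measured - t_expected)."""
    expected = rev_start_times + blade_idx * (2 * np.pi / n_blades) / omega
    return R * omega * (arrival_times - expected)

# Simulate 10 revolutions of blade 0 with a known 1 mm sinusoidal deflection,
# then recover it from the (synthetic) arrival times
rev_start = np.arange(10) * (2 * np.pi / omega)          # once-per-rev reference
true_defl = 1e-3 * np.sin(2 * np.pi * 0.3 * np.arange(10))   # metres
measured = rev_start + true_defl / (R * omega)           # arrival at the probe
recovered = tip_displacement(measured, rev_start, 0)
print(np.round(recovered * 1e3, 3))                      # mm
```

Analysis algorithms such as the autoregressive and inverse methods compared in the paper start from exactly these per-revolution displacement samples, which are undersampled relative to the blade's vibration frequency.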

Keywords: blade tip timing, blisk, finite element, vibration measurement

Procedia PDF Downloads 311
22787 Blood Glucose Measurement and Analysis: Methodology

Authors: I. M. Abd Rahim, H. Abdul Rahim, R. Ghazali

Abstract:

Numerous non-invasive blood glucose measurement techniques have been developed by researchers, and near-infrared (NIR) spectroscopy is currently among the most promising. However, there is some disagreement on the optimal wavelength range to be used as the reference for the glucose substance in the blood. This paper focuses on the experimental data collection technique and also on the analysis method used to analyze the data gained from the experiment. The selection of a suitable linear or non-linear model structure is essential in a prediction system, as the system developed needs to be sufficiently accurate.

Keywords: linear, near-infrared (NIR), non-invasive, non-linear, prediction system

Procedia PDF Downloads 461
22786 Preparation, Characterization, and Antimicrobial Activity of Carboxymethyl Chitosan Schiff Bases with Different Benzaldehyde Derivatives

Authors: Nadia A. Mohamed, Magdy W. Sabaa, Ahmed H. H. El-Ghandour, Marwa M. Abdel-Aziz, Omayma F. Abdel-Gawad

Abstract:

Eighteen carboxymethyl chitosan (CMCh) Schiff bases and their reduced derivatives have been synthesized. They were characterized by spectral analyses (FT-IR and ¹H-NMR) and scanning electron microscopy. Their antibacterial activities against Streptococcus pneumoniae (RCMB 010010) and Bacillus subtilis (RCMB 010067) as Gram-positive bacteria and Escherichia coli (RCMB 010052) as a Gram-negative bacterium, and their antifungal activity against Aspergillus fumigatus (RCMB 02568), Geotricum candidum (RCMB 05097), and Candida albicans (RCMB 05031), were examined using the agar disk diffusion method. The results demonstrate that the antibacterial and antifungal activities are clearly affected by both the nature and the position of the substituent groups in the aryl ring of the prepared derivatives. The CMCh-4-nitroBenz Schiff base and its reduced form show higher antimicrobial activity compared with the other para-substituted derivatives. The CMCh-4-nitroBenz Schiff base gave inhibition zones of 18.3, 17, and 15.6 mm against B. subtilis, S. pneumoniae, and E. coli, respectively, and 16.2, 17.3, and 16.4 mm against A. fumigatus, G. candidum, and C. albicans, respectively. Its reduced form gave 19.5, 18.7, and 16.2 mm against B. subtilis, S. pneumoniae, and E. coli, respectively, and 17.5, 19.5, and 17.4 mm against A. fumigatus, G. candidum, and C. albicans, respectively. CMCh-3-bromoBenz also showed good results; its Schiff base gave 19.2, 16.9, and 14.6 mm against B. subtilis, S. pneumoniae, and E. coli, respectively, and 18.4, 17.6, and 15.9 mm against A. fumigatus, G. candidum, and C. albicans, respectively.

Keywords: chitosan, Schiff base, minimum inhibitory concentration, antimicrobial activity

Procedia PDF Downloads 462
22785 Seasonal Assessment of Snow Cover Dynamics Based on Aerospace Multispectral Data on Livingston Island, South Shetland Islands in Antarctica and on Svalbard in Arctic

Authors: Temenuzhka Spasova, Nadya Yanakieva

Abstract:

Snow modulates the hydrological cycle, influences the functioning of ecosystems, and is a significant resource for many populations whose water is harvested from cold regions. Snow observations are important for validating climate models. The accumulation and rapid melt of snow are two of the most dynamic seasonal environmental changes on the Earth’s surface. The relevance of this research relates to modern tendencies in the application of remote sensing to problems of different natures in ecological monitoring of the environment. The subject of the study is the seasonal dynamics of snow cover on Livingston Island, South Shetland Islands in Antarctica, and on Svalbard in the Arctic. The objects were analyzed and mapped using European Space Agency (ESA) data acquired by the Sentinel-1 SAR (Synthetic Aperture Radar) and Sentinel-2 MSI sensors, together with GIS. Results have been obtained for changes in snow coverage during the summer-winter transition and its dynamics in the two hemispheres. The data used are of high temporal and spatial resolution, which is an advantage when observing snow cover. The MSI images have different spatial resolutions at the Earth's surface. The changes in the environmental objects are shown with the SAR images and different processing approaches. The results clearly show that snow and snow melting are best registered using SAR data with HH (horizontal-horizontal) polarization. Reliance on aerospace data and technology enables us to obtain different digital models and to structure and analyze results while excluding the subjective factor. Because of the large extent of terrestrial snow coverage and the difficulties of obtaining ground measurements over cold regions, remote sensing and GIS represent an important tool for studying snow areas and properties from regional to global scales.

Keywords: climate changes, GIS, remote sensing, SAR images, snow coverage

Procedia PDF Downloads 219
22784 Disclosure of Financial Risk on Sharia Banks in Indonesia

Authors: Renny Wulandari

Abstract:

This study aims to determine the influence of Non Performing Financing, Financing to Deposit Ratio, Operating Expenses to Operating Revenue, and Net Income Margin on the disclosure of financial risk in Sharia banks. To achieve these objectives, an associative research method was conducted using secondary data in the form of annual report data for the period 2013-2016. The population in this study is the Sharia banking industry in Indonesia, comprising banks that issued annual financial statements. Sampling was carried out by probability sampling. The analysis in this research was performed with SEM-PLS. The results show that Net Income Margin has a significant effect on financial risk disclosure, while Non Performing Financing (NPF), Financing to Deposit Ratio (FDR), and Operating Expenses to Operating Revenue (OEOR) have no effect on the disclosure of financial risk in Sharia banks.

Keywords: Sharia banks, disclosure of financial risk, non performing financing, financing to deposit ratio, operating expenses to operating revenue, net income margin

Procedia PDF Downloads 234
22783 Model Observability – A Monitoring Solution for Machine Learning Models

Authors: Amreth Chandrasehar

Abstract:

Machine Learning (ML) models are developed and run in production to solve various use cases that help organizations become more efficient and drive the business. But this comes at a massive development cost and, when projects fail, lost business opportunities. According to a Gartner report, 85% of data science projects fail, and one of the contributing factors is not paying attention to model observability. Model observability helps developers and operators pinpoint model performance issues such as data drift and helps identify the root cause of issues. This paper focuses on providing insights into incorporating model observability into model development and operationalizing it in production.
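The kind of drift check an observability layer performs can be sketched as follows; this is an illustrative example using a two-sample Kolmogorov-Smirnov test on synthetic data, not a description of any particular platform:

```python
import numpy as np
from scipy import stats

def detect_drift(reference, production, alpha=0.05):
    """Flag data drift per feature with a two-sample Kolmogorov-Smirnov test.

    reference  : 2-D array of training-time feature values (rows = samples)
    production : 2-D array of live feature values, same column layout
    Returns a list of (feature_index, ks_statistic, p_value, drifted)."""
    report = []
    for j in range(reference.shape[1]):
        stat, p = stats.ks_2samp(reference[:, j], production[:, j])
        report.append((j, stat, p, p < alpha))
    return report

rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, size=(2000, 2))
prod = np.column_stack([rng.normal(1.0, 1.0, 2000),   # feature 0: mean shifted (drift)
                        ref[:, 1]])                   # feature 1: unchanged
report = detect_drift(ref, prod)
```

In practice such a report would be emitted on a schedule and wired to alerting, so that a flagged feature triggers root-cause analysis before model quality degrades.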

Keywords: model observability, monitoring, drift detection, ML observability platform

Procedia PDF Downloads 112
22782 The Contribution of Sanitation Practices to Marine Pollution and the Prevalence of Water-Borne Diseases in Prampram Coastal Area, Greater Accra-Ghana

Authors: Precious Roselyn Obuobi

Abstract:

Background: In Ghana, water-borne diseases remain a public health concern due to their impact. While marine pollution has been linked to outbreaks of disease, especially in communities along the coast, associated risks such as oil spillage, marine debris, erosion, and improper waste disposal and management practices persist. Objective: The study seeks to investigate the sanitation practices that contribute to marine pollution in Prampram and the prevalence of selected water-borne diseases (diarrhea and typhoid fever). Method: This study used a descriptive cross-sectional design employing a mixed-method (qualitative and quantitative) approach. Twenty-two (22) participants were selected, and semi-structured questionnaires were administered to them. Additionally, interviews were conducted to collect more information, and an observation checklist was used to aid the data collection process. Secondary data comprising information on water-borne diseases in the district were acquired from the district health directorate to determine the prevalence of the selected water-borne diseases in the community. Data Analysis: The qualitative data were analyzed using NVIVO® software, adapting the six-step thematic analysis of Braun and Clarke, while STATA® version 16 was used to analyze the secondary data collected from the district health directorate. Descriptive statistics (mean, standard deviation, frequencies, and proportions) were used to summarize the results. Results: The results showed that open defecation and indiscriminate waste disposal were the main practices contributing to marine pollution in Prampram and its effects on public health. Conclusion: These findings have implications for public health and the environment; thus, efforts need to be stepped up in educating the community on best sanitation practices.

Keywords: environment, sanitation, marine pollution, water-borne diseases

Procedia PDF Downloads 76
22781 A Study on Vulnerability of Alahsa Governorate to Generate Urban Heat Islands

Authors: Ilham S. M. Elsayed

Abstract:

The purpose of this study is to investigate the status of Alahsa Governorate and its vulnerability to generating urban heat islands. Alahsa Governorate is a famous oasis in the Arabian Peninsula containing several oil centers. An extensive literature review was carried out to collect previous data on the urban heat island of Alahsa Governorate. Data used for the purpose of this research were collected from the authorized bodies that control the weather station networks over Alahsa Governorate, Eastern Province, Saudi Arabia. Although the number of weather stations within the region is very limited, and analysis using GIS software and its techniques is therefore difficult and limited, the data analyzed confirm an increase in temperature of more than 2 °C from 2004 to 2014. Such an increase is considerable whenever human health and comfort are the concern. The increase in temperature within one decade confirms the presence of urban heat islands. The study concludes that Alahsa Governorate is vulnerable to creating urban heat islands, and more attention should be given to strategic planning of the governorate, which is developing at a high pace with considerably increasing levels of urbanization.

Keywords: Alahsa Governorate, population density, Urban Heat Island, weather station

Procedia PDF Downloads 250
22780 Technological Development of a Biostimulant Bioproduct for Fruit Seedlings: An Engineering Overview

Authors: Andres Diaz Garcia

Abstract:

The successful technological development of any bioproduct, including those of the biostimulant type, requires the adequate completion of a series of stages allied to different disciplines related to microbiological, engineering, pharmaceutical chemistry, legal, and market components, among others. Engineering as a discipline makes a key contribution to different aspects of fermentation processes, such as the design and optimization of culture media, the standardization of operating conditions within the bioreactor, and the scaling of the production process of the active ingredient that will be used in downstream unit operations. However, all of the aspects mentioned must take into account many biological factors of the microorganism, such as the growth rate, the level of assimilation of various organic and inorganic sources, and the mechanisms of action associated with its biological activity. This paper focuses on the practical experience within the Colombian Corporation for Agricultural Research (Agrosavia) that led to the development of a biostimulant bioproduct based on the native rhizobacterium Bacillus amyloliquefaciens, oriented mainly to plant growth promotion in cape gooseberry nurseries and fruit crops in Colombia, and the challenges that were overcome through expertise in the area of engineering. Through the application of engineering strategies and tools, a culture medium was optimized to obtain concentrations higher than 1E09 CFU (colony forming units)/ml in liquid fermentation, the biomass production process was standardized, and a scale-up strategy was generated based on geometric criteria (bioreactor H/D ratios) and on operational criteria based on a minimum dissolved oxygen concentration, taking into account the differences in process control capacity between the laboratory and pilot scales. Currently, the bioproduct obtained through this technological process is in the registration stage in Colombia for cape gooseberry fruits for export.

Keywords: biochemical engineering, liquid fermentation, plant growth promoting, scale-up process

Procedia PDF Downloads 112
22779 The Impact of Agricultural Product Export on Income and Employment in Thai Economy

Authors: Anucha Wittayakorn-Puripunpinyoo

Abstract:

The research objectives were 1) to study the situation and trend of agricultural product exports of Thailand, 2) to study the impact of agricultural product exports on income in the Thai economy, 3) to study the impact of agricultural product exports on employment in the Thai economy, and 4) to provide recommendations for agricultural product export policy in Thailand. In this research, secondary data were collected as yearly time series from 1990 to 2016, accounting for 27 years. Data were collected from the Bank of Thailand database. Primary data were collected from the stakeholders of Thailand's agricultural product export policy. Data analysis applied descriptive statistics such as the arithmetic mean and standard deviation. Forecasting of agricultural product exports applied the Monte Carlo simulation technique as well as time trend analysis. In addition, the impact of agricultural product exports on income and employment was analyzed by applying an econometric model, with the parameters estimated by the ordinary least squares technique. The research results revealed that 1) the agricultural product export value of Thailand from 1990 to 2016 averaged 338,959.5 million Thai baht, with a yearly growth rate of 4.984 percent; in addition, the forecast agricultural product export value of Thailand continues to increase, but its growth rate declines; 2) agricultural product exports have a positive impact on income in the Thai economy: an increase in Thailand's agricultural product exports of 1 percent would increase income by 0.0051 percent; 3) agricultural product exports have a positive impact on employment in the Thai economy: an increase in Thailand's agricultural product exports of 1 percent would increase employment by 0.079 percent; and 4) in the future, agricultural product export policy should focus on finished or semi-finished agricultural products instead of raw materials, applying technology and innovation to add value to agricultural product exports.
Public agricultural product export policy should support exporters in the private sector in order to encourage them as agricultural exporters in Thailand.
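The reported percentage effects are elasticities from a log-log specification estimated by ordinary least squares; the sketch below illustrates the idea on synthetic data built with the reported income elasticity of 0.0051 (all other numbers are hypothetical, not the study's data):

```python
import numpy as np

# hypothetical yearly series standing in for the 1990-2016 data (27 observations);
# only the elasticity of 0.0051 is taken from the abstract
exports = np.linspace(1.0e5, 4.0e5, 27)   # export value (made-up units)
income = 5.0e6 * exports ** 0.0051        # income constructed with elasticity 0.0051

# log-log OLS: ln(income) = alpha + beta * ln(exports); beta is the elasticity
X = np.column_stack([np.ones_like(exports), np.log(exports)])
beta, *_ = np.linalg.lstsq(X, np.log(income), rcond=None)
elasticity = beta[1]
```

In the log-log form, the slope reads directly as "a 1 percent rise in exports raises income by beta percent", which is how the abstract's 0.0051 and 0.079 figures are interpreted.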

Keywords: agricultural product export, income, employment, Thai economy

Procedia PDF Downloads 309
22778 Seafloor and Sea Surface Modelling in the East Coast Region of North America

Authors: Magdalena Idzikowska, Katarzyna Pająk, Kamil Kowalczyk

Abstract:

Seafloor topography is a fundamental issue in geological, geophysical, and oceanographic studies. Single-beam or multibeam sonars attached to the hulls of ships are used to emit a hydroacoustic signal from transducers and reproduce the topography of the seabed. This solution provides relevant accuracy and spatial resolution. Bathymetric data from ship surveys are provided by the National Centers for Environmental Information of the National Oceanic and Atmospheric Administration. Unfortunately, most of the seabed is still unidentified, as there are still many gaps to be explored between ship survey tracks. Moreover, such measurements are very expensive and time-consuming. One solution is the raster bathymetric models shared by the General Bathymetric Chart of the Oceans. The offered products are a compilation of different sets of data, raw or processed. Measurements of gravity anomalies also serve as indirect data for the development of bathymetric models. Some forms of seafloor relief (e.g., seamounts) increase the force of the Earth's pull, leading to changes in the sea surface. Based on satellite altimetry data, sea surface height and marine gravity anomalies can be estimated, and from the anomalies it is possible to infer the structure of the seabed. The main goal of the work is to create regional bathymetric models and models of the sea surface in the area of the east coast of North America, a region of seamounts and undulating seafloor. The research includes an analysis of the methods and techniques used, an evaluation of the interpolation algorithms applied, model densification, and the creation of grid models. The data used are raster bathymetric models in NetCDF format, survey data from multibeam soundings in MB-System format, and satellite altimetry data from the Copernicus Marine Environment Monitoring Service. The methodology includes data extraction, processing, mapping, and spatial analysis.
Visualization of the obtained results was carried out with Geographic Information System tools. The result is an extension of the state of knowledge of the quality and usefulness of the data used for seabed and sea surface modeling, and of the accuracy of the generated models. Sea level is averaged over time and space (excluding waves, tides, etc.). Its changes, along with knowledge of the topography of the ocean floor, inform us indirectly about the volume of the entire ocean. The true shape of the ocean surface is further varied by phenomena such as tides, differences in atmospheric pressure, wind systems, thermal expansion of water, and phases of ocean circulation. Depending on the location of the point, the greater the depth, the lower the trend of sea level change. Studies show that combining data sets from different sources with different accuracies can affect the quality of sea surface and seafloor topography models.
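Filling the gaps between survey tracks amounts to interpolating sparse soundings onto a regular grid; a minimal sketch with hypothetical coordinates and a synthetic planar seafloor (not the study's actual data or interpolation algorithm) might look like:

```python
import numpy as np
from scipy.interpolate import griddata

# hypothetical sparse soundings (lon, lat, depth) standing in for ship-track data
rng = np.random.default_rng(2)
lon = rng.uniform(-70.0, -60.0, 200)
lat = rng.uniform(30.0, 40.0, 200)
depth = -4000.0 + 50.0 * lon - 30.0 * lat     # synthetic planar seafloor (m)

# regular 0.5-degree grid to fill the gaps between survey tracks;
# linear interpolation returns NaN outside the convex hull of the soundings
gl, gt = np.meshgrid(np.arange(-69.5, -60.5, 0.5), np.arange(30.5, 39.5, 0.5))
grid = griddata((lon, lat), depth, (gl, gt), method='linear')
```

Evaluating interpolation algorithms then reduces to comparing such gridded surfaces against withheld soundings, which is the kind of accuracy assessment the study describes.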

Keywords: seafloor, sea surface height, bathymetry, satellite altimetry

Procedia PDF Downloads 80
22777 Mobi-DiQ: A Pervasive Sensing System for Delirium Risk Assessment in Intensive Care Unit

Authors: Subhash Nerella, Ziyuan Guan, Azra Bihorac, Parisa Rashidi

Abstract:

Intensive care units (ICUs) provide care to critically ill patients in severe and life-threatening conditions. However, patient monitoring in the ICU is limited by the time and resource constraints imposed on healthcare providers. Many critical care indices, such as mobility, are still manually assessed, which can be subjective, prone to human error, and lacking in granularity. Other important aspects, such as environmental factors, are not monitored at all. For example, critically ill patients often experience circadian disruptions due to the absence of effective environmental “timekeepers” such as the light/dark cycle, and due to the systemic effect of acute illness on chronobiologic markers. Although the occurrence of delirium is associated with circadian disruption risk factors, these factors are not routinely monitored in the ICU. Hence, there is a critical unmet need to develop systems for precise and real-time assessment through novel enabling technologies. We have developed the mobility and circadian disruption quantification system (Mobi-DiQ) by augmenting biomarker and clinical data with pervasive sensing data to generate cues related to mobility, nightly disruptions, and light and noise exposure. We hypothesize that Mobi-DiQ can provide accurate mobility and circadian cues that correlate with bedside clinical mobility assessments and circadian biomarkers, which is ultimately important for delirium risk assessment and prevention. The collected multimodal dataset consists of depth images, electromyography (EMG) data, patient extremity movement captured by accelerometers, ambient light levels, sound pressure level (SPL), and indoor air quality measured by volatile organic compound levels and the equivalent CO₂ concentration.
For delirium risk assessment, the system recognizes mobility cues (axial body movement features and body key points) and circadian cues, including nightly disruptions, ambient SPL, and light intensity, as well as other environmental factors such as indoor air quality. The Mobi-DiQ system consists of three major components: the pervasive sensing system, a data storage and analysis server, and a data annotation system. For data collection, six local pervasive sensing systems were deployed, each including a local computer and sensors. A video recording tool with a graphical user interface (GUI), developed in Python, was used to capture depth image frames for analyzing patient mobility. All sensor data are encrypted and then automatically uploaded to the Mobi-DiQ server through a secured VPN connection. Several data pipelines were developed to automate data transfer, curation, and preparation for annotation and model training. Data curation and post-processing are performed on the server. A custom secure annotation tool with a GUI was developed to annotate the depth activity data. The annotation tool is linked to a MongoDB database to record the annotations and provide summarization. Docker containers are also utilized to manage the services and pipelines running on the server in an isolated manner. The processed clinical data and annotations are used to train and develop real-time pervasive sensing systems to augment clinical decision-making and promote targeted interventions. In the future, we intend to evaluate our system in a clinical implementation trial, as well as to refine and validate it using other data sources, including neurological data obtained through continuous electroencephalography (EEG).

Keywords: deep learning, delirium, healthcare, pervasive sensing

Procedia PDF Downloads 93
22776 Influence of a Company’s Dynamic Capabilities on Its Innovation Capabilities

Authors: Lovorka Galetic, Zeljko Vukelic

Abstract:

The advanced concepts of strategic and innovation management in the sphere of company dynamic and innovation capabilities, and the achievement of their mutual alignment and a synergy effect, are important elements in business today. This paper analyses the theory and empirically investigates the influence of a company’s dynamic capabilities on its innovation capabilities. A new multidimensional model of dynamic capabilities is presented, consisting of five factors appropriate to real-time requirements, while innovation capabilities are considered pursuant to the official OECD and Eurostat standards. After an examination of dynamic and innovation capabilities indicated their theoretical links, an empirical study testing the model and examining the influence of a company’s dynamic capabilities on its innovation capabilities showed significant results. In the study, a research model was posed to relate company dynamic and innovation capabilities. One side of the model features the variables that are the determinants of dynamic capabilities, defined through their factors, while the other side features the determinants of innovation capabilities pursuant to the official standards. With regard to the research model, five hypotheses were set. The study was performed in late 2014 on a representative sample of large and very large Croatian enterprises with a minimum of 250 employees. The research instrument was a questionnaire administered to company top management. For both variables, the position of the company was assessed in comparison to industry competitors, on a five-point scale. In order to test the hypotheses, correlation tests were performed to determine whether there is a correlation between each individual factor of company dynamic capabilities and the existence of its innovation capabilities, in line with the research model.
The results indicate a strong correlation between a company’s possession of dynamic capabilities, in terms of the factors of the new multidimensional model presented in this paper, and its possession of innovation capabilities. Based on the results, all five hypotheses were accepted. Ultimately, it was concluded that there is a strong association between the dynamic and innovation capabilities of a company.
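The hypothesis tests described are bivariate correlations between five-point-scale scores; a minimal sketch on synthetic survey data (hypothetical numbers, not the study's dataset) might look like:

```python
import numpy as np
from scipy import stats

# hypothetical five-point-scale survey scores for n firms:
# one dynamic-capability factor vs. an innovation-capability score
rng = np.random.default_rng(3)
n = 120
dynamic_factor = rng.integers(1, 6, n).astype(float)          # scores 1..5
innovation = np.clip(dynamic_factor + rng.normal(0, 1.0, n), 1, 5)

# Pearson correlation and its p-value; a significant positive r
# supports the hypothesis linking the two capability constructs
r, p = stats.pearsonr(dynamic_factor, innovation)
```

With ordinal five-point data, a rank-based alternative such as Spearman's rho (`stats.spearmanr`) is often preferred; the decision between the two depends on whether the scale is treated as interval or ordinal.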

Keywords: dynamic capabilities, innovation capabilities, competitive advantage, business results

Procedia PDF Downloads 305
22775 Delineation of the Geoelectric and Geovelocity Parameters in the Basement Complex of Northwestern Nigeria

Authors: M. D. Dogara, G. C. Afuwai, O. O. Esther, A. M. Dawai

Abstract:

The geology of Northern Nigeria is under intense investigation, particularly that of the northwest, believed to be part of the basement complex. The lithology is highly variable, hence the need for a close-range study. It is in view of the above that two geophysical techniques, vertical electrical sounding employing the Schlumberger array and the seismic refraction method, were used to delineate the geoelectric and geovelocity parameters of the basement complex of northwestern Nigeria. A total area of 400,000 m² was covered, with sixty geoelectric stations established and sixty sets of seismic refraction data collected using the forward and reverse method. The interpretation of the resistivity data suggests that the area is underlain by not more than five geoelectric layers of varying thicknesses and resistivities when a maximum half-electrode spread of 100 m is used. The interpreted seismic data revealed two geovelocity layers, with velocities ranging between 478 m/s and 1666 m/s for the first layer and between 1166 m/s and 7141 m/s for the second layer. The results of the two techniques suggest that the study area has an undulating bedrock topography, with geoelectric and geovelocity layers composed of weathered rock materials.

Keywords: basement complex, delineation, geoelectric, geovelocity, Nigeria

Procedia PDF Downloads 151
22774 Adult Language Learning in the Institute of Technology Sector in the Republic of Ireland

Authors: Una Carthy

Abstract:

A recent study of third-level institutions in Ireland reveals that barriers of both age and aptitude can be overcome by teaching methodologies that motivate second language learners. This PhD investigation gathered quantitative and qualitative data from 14 Institutes of Technology over a three-year period from 2011 to 2014. The fundamental research question was to establish the impact of institutional language policy on attitudes towards language learning. However, other related issues around second language acquisition arose in the course of the investigation. Data were collected from both lecturers and students, allowing interesting points of comparison to emerge from both datasets. Negative perceptions among lecturers regarding language provision were often associated with the view that language learning belongs at primary and secondary level and has no place in third-level education. This perception was offset by substantial data showing positive attitudes towards adult language learning. Lenneberg’s critical period hypothesis postulated that the optimum age for learning a second language is before puberty. More recently, scholars have challenged this theory in their studies, revealing that mature learners can and do succeed at learning languages. With regard to aptitude, a preoccupation among lecturers regarding poor literacy skills among students emerged and was often associated with resistance to second language acquisition. This was offset by a preponderance of qualitative data from students highlighting the crucial role which teaching approaches play in the learning process. Interestingly, the data collected regarding learning disabilities reveal that, given appropriate learning environments, individuals can be motivated to acquire second languages and indeed succeed at learning them. These findings are in keeping with other recent studies of attitudes towards second language learning among students with learning disabilities.
Both sets of findings reinforce the case for language policies in the Institutes of Technology (IoTs). Supportive and positive learning environments can be created in third-level institutions to motivate adult learners, thereby overcoming perceived obstacles relating to age and aptitude.

Keywords: age, aptitude, second language acquisition, teaching methodologies

Procedia PDF Downloads 123