Search results for: electronic data interchange
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26311


23821 Impact of Protean Career Attitude on Career Success with the Mediating Effect of Career Insight

Authors: Prabhashini Wijewantha

Abstract:

This study examines the impact of employees’ protean career attitude on their career success, and then the mediating effect of career insight on that relationship. Career success is defined as the accomplishment of desirable work-related outcomes at any point in a person’s work experiences over time, and it comprises two sub-variables, namely, career satisfaction and perceived employability. Protean career attitude was measured using the eight items of the Self-Directedness subscale of the Protean Career Attitude scale developed by Briscoe and Hall, whereas career satisfaction was measured by the three-item scale developed by Martins, Eddleston, and Veiga. Perceived employability was also evaluated using three items, and career insight was measured using fourteen items adapted and used by De Vos and Soens. Data were collected from a sample of 300 mid-career executives in Sri Lanka using a survey strategy, and the data were analyzed using SPSS and AMOS version 20.0. A preliminary analysis was first performed, in which the data were screened and reliability and validity were ensured. Next, a simple regression analysis was performed to test the direct impact of protean career attitude on career success, and the hypothesis was supported. Baron and Kenny’s four-step, three-regression approach to mediator testing was used to assess the mediating effect of career insight on the above relationship, and partial mediation was supported by the data. Finally, theoretical and practical implications are discussed.
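
The Baron and Kenny procedure described above amounts to three ordinary least-squares regressions. A minimal sketch on simulated data follows; the variable names, effect sizes, and sample construction are hypothetical illustrations, not the study's data:

```python
import numpy as np

def ols(y, X):
    """Return OLS coefficients for y ~ X (intercept added as first column)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta  # [intercept, slope(s)...]

rng = np.random.default_rng(0)
n = 300                                                    # sample size as in the study
pca = rng.normal(size=n)                                   # protean career attitude (X)
insight = 0.6 * pca + rng.normal(size=n)                   # career insight (mediator M)
success = 0.3 * pca + 0.5 * insight + rng.normal(size=n)   # career success (Y)

c = ols(success, pca)[1]                                   # step 1: total effect X -> Y
a = ols(insight, pca)[1]                                   # step 2: X -> M
cp, b = ols(success, np.column_stack([pca, insight]))[1:]  # step 3: X and M -> Y

# Partial mediation: the direct effect c' shrinks relative to c but stays nonzero.
print(c, a, cp, b)
```

With these simulated effect sizes, the recovered direct effect c' is smaller than the total effect c while remaining nonzero, which is the pattern the abstract reports as partial mediation.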

Keywords: career success, career insight, mid-career MBAs, protean career attitude

Procedia PDF Downloads 360
23820 Studying the Influence of Systematic Pre-Occupancy Data Collection through Post-Occupancy Evaluation: A Shift in the Architectural Design Process

Authors: Noor Abdelhamid, Donovan Nelson, Cara Prosser

Abstract:

The architectural design process can be mapped as a dialogue between designer and user, constructed across multiple phases, with the overarching goal of aligning design outcomes with user needs. Traditionally, this dialogue is bounded by a preliminary phase of determining the factors that will direct the design intent, and a completion phase of handing off the project to the client. Pre- and post-occupancy evaluations (P/POEs) can provide an alternative process by extending this dialogue at both ends of the design process. The purpose of this research is to study the influence of systematic pre-occupancy data collection on achieving design goals by conducting post-occupancy evaluations of two case studies. In the context of this study, systematic pre-occupancy data collection is defined as the preliminary documentation of existing conditions that helps portray stakeholders’ needs. When implemented, pre-occupancy data collection occurs during the early phases of the architectural design process, and the information gathered is used to shape the design intent. Investigative POEs are performed on two case studies with distinct early design approaches to understand how the current space is impacting user needs, establish design outcomes, and inform future strategies. The first case study underwent systematic pre-occupancy data collection and synthesis, while the other represents the traditional, uncoordinated practice of informally collecting data during an early design phase. POEs target the dynamics between the building and its occupants by studying how spaces are serving the needs of the users. Data collection for this study consists of user surveys, audiovisual materials, and observations during regular site visits. Mixed qualitative and quantitative analyses are synthesized to identify patterns in the data.
The paper concludes by positioning value on both sides of the architectural design process: the integration of systematic pre-occupancy methods in the early phases and the reinforcement of a continued dialogue between building and design team after building completion.

Keywords: architecture, design process, pre-occupancy data, post-occupancy evaluation

Procedia PDF Downloads 164
23819 An Analysis of Oil Price Changes and Other Factors Affecting Iranian Food Basket: A Panel Data Method

Authors: Niloofar Ashktorab, Negar Ashktorab

Abstract:

Oil exports fund nearly half of Iran’s government expenditures, and for many years other countries have imposed various sanctions against Iran. Sanctions that primarily target Iran’s key energy sector have harmed Iran’s economy, although the strategic effect of sanctions may diminish as Iran adjusts to them economically. In this study, we evaluate the impact of the oil price and of sanctions against Iran on food commodity prices using a panel data method. We find that the food commodity prices, the oil price, and the real exchange rate are stationary. The results show a positive effect of oil price changes, the real exchange rate, and sanctions on food commodity prices.

Keywords: oil price, food basket, sanctions, panel data, Iran

Procedia PDF Downloads 356
23818 A Proposed Framework for Software Redocumentation Using Distributed Data Processing Techniques and Ontology

Authors: Laila Khaled Almawaldi, Hiew Khai Hang, Sugumaran A. l. Nallusamy

Abstract:

Legacy systems are crucial for organizations, but their intricacy and lack of documentation pose challenges for maintenance and enhancement. Redocumentation of legacy systems is vital for automatically or semi-automatically creating documentation for software lacking sufficient records; it aims to enhance system understandability, maintainability, and knowledge transfer. However, existing redocumentation methods need improvement in data processing performance and document generation efficiency, a need which stems from the necessity of efficiently handling the extensive and complex code of legacy systems. This paper proposes a method for semi-automatic legacy system redocumentation using semantic parallel processing and ontology. Leveraging parallel processing and ontology addresses the current challenges by distributing the workload and creating documentation with logically interconnected data. The paper outlines the challenges in legacy system redocumentation and suggests a redocumentation method based on parallel processing and ontology for improved efficiency and effectiveness.

Keywords: legacy systems, redocumentation, big data analysis, parallel processing

Procedia PDF Downloads 46
23817 Application of Bioreactors in Regenerative Dentistry: Literature Review

Authors: Neeraj Malhotra

Abstract:

Background: Bioreactors in tissue engineering are devices that apply mechanical means to influence biological processes. They are commonly employed for stem cell culturing, growth, and expansion, as well as in 3D tissue culture. Their use is now well established and has been tested extensively in the medical sciences for tissue regeneration and for the tissue engineering of organs such as bone, cartilage, blood vessels, skin grafts, and cardiac muscle. Methodology: A literature search, both electronic and by hand, was done using the following MeSH terms and keywords: bioreactors; bioreactors and dentistry; bioreactors and dental tissue engineering; bioreactors and regenerative dentistry. Only articles published in English were included for review. Results: Bioreactors such as spinner-flask, rotating-wall, flow-perfusion, and micro-bioreactors, as well as the in-vivo bioreactor, have been employed and tested for the regeneration of dental and related tissues. These include gingival tissue, the periodontal ligament, alveolar bone, mucosa, cementum, and blood vessels. Based on their working dynamics, they could in the future be customized for the regeneration of pulp tissue and for whole-tooth regeneration. Apart from this, they have been successfully used in testing the clinical efficacy and biological safety of dental biomaterials. Conclusion: Bioreactors have potential use in testing dental biomaterials and in tissue engineering approaches aimed at regenerative dentistry.

Keywords: bioreactors, biological process, mechanical stimulation, regenerative dentistry, stem cells

Procedia PDF Downloads 209
23816 Armenian Refugees in Early 20th Century Japan: A Quantitative Analysis of Their Number Based on Japanese Historical Data, with a Comparison to Foreign Historical Data

Authors: Meline Mesropyan

Abstract:

At the beginning of the 20th century, Japan served as a transit point for Armenian refugees fleeing the 1915 Genocide. However, research on Armenian refugees in Japan is sparse, and the Armenian Diaspora has never taken root in Japan. Consequently, Japan has not been considered a relevant research site for studying Armenian refugees. The primary objective of this study is to shed light on the number of Armenian refugees who passed through Japan between 1915 and 1930. Quantitative analyses will be conducted based on newly uncovered Japanese archival documents. Subsequently, the Japanese data will be compared to American immigration data to estimate the potential number of refugees in Japan during that period. This under-researched area is relevant to both the Armenian Diaspora and refugee studies in Japan. By clarifying the number of refugees, this study aims to enhance understanding of Japan's treatment of refugees and the extent of humanitarian efforts conducted by organizations and individuals in Japan, contributing to the broader field of historical refugee studies.

Keywords: Armenian genocide, Armenian refugees, Japanese statistics, number of refugees

Procedia PDF Downloads 57
23815 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis

Authors: Gon Park

Abstract:

Seoul, South Korea, established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan does not include detailed analyses that incorporate land-cover information across many structural classes into the green infrastructure. This study maps green infrastructure networks of Seoul to complement these green plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, were identified by incorporating cadastral data on 967,502 parcels into 135 land use maps using a geographic information system. Network analyses were used to rank the hubs and links of the green infrastructure map, applying a force-directed algorithm, weighted values, and binary relationships with metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can serve as a framework to identify and rank hubs, links, and networks for green infrastructure planning under variable scenarios of green areas in cities.
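
The hub-and-link ranking described above can be sketched with networkx on a toy graph. The node names, edge weights, and graph itself are invented for illustration and are not the Seoul data:

```python
import networkx as nx

# Toy green-infrastructure graph: nodes are green-area hubs (aggregated from
# parcels), edges are candidate links weighted by connection distance (km).
G = nx.Graph()
G.add_weighted_edges_from([
    ("park_A", "park_B", 1.2),
    ("park_B", "riverbank", 0.8),
    ("riverbank", "forest", 2.5),
    ("park_A", "forest", 3.0),
    ("park_B", "forest", 1.9),
])

# Rank hubs by weighted degree and by distance-weighted betweenness centrality.
degree_rank = sorted(G.degree(weight="weight"), key=lambda kv: -kv[1])
betweenness = nx.betweenness_centrality(G, weight="weight")

# A force-directed (spring) layout, as mentioned in the abstract.
pos = nx.spring_layout(G, weight="weight", seed=1)

print(degree_rank[0][0])   # prints: forest (largest weighted degree)
```

Here weighted degree flags the best-connected hub, while betweenness flags hubs that sit on the shortest links between other green areas; the study combines such metrics with density and distance.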

Keywords: cadastral data, green infrastructure, network analysis, parcel data

Procedia PDF Downloads 206
23814 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms

Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao

Abstract:

Earth’s environment and its evolution can be observed through satellite images in near real-time. Through satellite imagery, remote sensing data provide crucial information for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing it, feeding the processed data into the proposed algorithm, and analyzing the obtained result. Algorithms used in satellite imagery classification include U-Net, Random Forest, DeepLabv3, CNNs, ANNs, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification, with the DeepGlobe Land Cover Classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates, in cascade or in parallel, to determine the scale of segments.
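
The core idea behind atrous convolution, spacing the filter taps to enlarge the receptive field without adding weights, can be shown in one dimension. This is a minimal numpy sketch of the operation, not the DeepLabv3 implementation:

```python
import numpy as np

def atrous_conv1d(x, w, rate):
    """1-D atrous (dilated) convolution: the taps of w are spaced `rate`
    samples apart, enlarging the receptive field without extra parameters."""
    k = len(w)
    span = (k - 1) * rate + 1          # effective receptive field
    out = np.empty(len(x) - span + 1)
    for i in range(len(out)):
        out[i] = sum(w[j] * x[i + j * rate] for j in range(k))
    return out

x = np.arange(10, dtype=float)
w = np.array([1.0, 1.0, 1.0])

print(atrous_conv1d(x, w, rate=1))  # ordinary convolution: 3 taps span 3 samples
print(atrous_conv1d(x, w, rate=2))  # same 3 weights now span 5 samples
```

DeepLabv3 applies the 2-D analogue at several rates in parallel (atrous spatial pyramid pooling) so one layer sees segments at multiple scales.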

Keywords: area calculation, atrous convolution, DeepGlobe land cover classification, DeepLabv3, land cover classification, ResNet-50

Procedia PDF Downloads 140
23813 The Effect of CPU Location in Total Immersion of Microelectronics

Authors: A. Almaneea, N. Kapur, J. L. Summers, H. M. Thompson

Abstract:

Meeting the growth in demand for digital services such as social media, telecommunications, and business and cloud services requires large-scale data centres, which has led to an increase in their end-use energy demand. Generally, over 30% of data centre power is consumed by the necessary cooling overhead; energy use can therefore be reduced by improving cooling efficiency. Both air and liquid can be used as cooling media for the data centre. Traditional data centre cooling systems use air, but liquid is recognised as a promising method that can handle the more densely packed data centres. Liquid cooling can be classified into three methods: rack heat exchanger, on-chip heat exchanger, and full immersion of the microelectronics. This study quantifies the improvements in heat transfer for the case of immersed microelectronics by varying the CPU and heat sink location. Immersion of the server is achieved by filling the gap between the microelectronics and a water jacket with a dielectric liquid, which convects the heat from the CPU to the water jacket on the opposite side. Heat transfer is governed by two physical mechanisms: natural convection in the fixed enclosure filled with dielectric liquid, and forced convection in the water that is pumped through the water jacket. The model in this study is validated against published numerical and experimental work and shows good agreement. The results show that the heat transfer performance and Nusselt number (Nu) are improved by 89% by placing the CPU and heat sink at the bottom of the microelectronics enclosure.

Keywords: CPU location, data centre cooling, heat sink in enclosures, immersed microelectronics, turbulent natural convection in enclosures

Procedia PDF Downloads 272
23812 A Macroeconomic Analysis of Defense Industry: Comparisons, Trends and Improvements in Brazil and in the World

Authors: J. Fajardo, J. Guerra, E. Gonzales

Abstract:

This paper outlines a study of Brazil’s industrial base of defense (IDB), using a bibliographic research method combined with an analysis of macroeconomic data from several publicly available data platforms. It begins with a brief study of Brazilian national industry, including analyses of productivity, income, output, and jobs. Next, the research presents a study of the defense industry in Brazil, introducing the main national companies that operate in the aeronautical, army, and naval branches. After establishing the main features of the Brazilian defense industry, data on the productivity of the defense industries of the leading countries, and of the companies competing with Brazilian industry, were analyzed in order to compare major international cases with Brazil. As for methodology, bibliographic research and the exploration of historical data series were used to analyze information, identify trends, and make comparisons over time. The research closes with the main trends for the development of the Brazilian defense industry, comparing the current situation with the perspectives of several countries.

Keywords: economics of defence, industry, trends, market

Procedia PDF Downloads 156
23811 Delineating Subsurface Linear Features and Faults Under Sedimentary Cover in the Bahira Basin Using Integrated Gravity and Magnetic Data

Authors: M. Lghoul, N. El Goumi, M. Guernouche

Abstract:

In order to predict the structural and tectonic framework of the Bahira basin and to build a 3D geological model of the basin, an integrated multidisciplinary study has been conducted using gravity, magnetic, and geological data. The objective of the current study is to delineate the subsurface features, faults, and geological limits of the Bahira basin through analysis of airborne magnetic and gravity data. To achieve this goal, we applied several enhancement techniques to the magnetic and gravity data: power spectral analysis, reduction to the pole (RTP), upward continuation, the analytic signal, the tilt derivative, the total horizontal derivative, 3D Euler deconvolution, and source parameter imaging. The major lineament/fault trends are NE-SW, NW-SE, ENE-WSW, and WNW-ESE. The 3D Euler deconvolution analysis highlighted a number of fault trends, mainly in the ENE-WSW and WNW-ESE directions. The depth to the top of the basement sources in the study area ranges from 200 m, in the southern and northern parts of the Bahira basin, to 5000 m in the eastern part of the basin.
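
As an illustration of one of the enhancement techniques listed above, the tilt derivative (the arctangent of the vertical derivative over the total horizontal derivative) can be sketched on a synthetic anomaly. This is a minimal numpy version assuming a uniform grid and a wavenumber-domain vertical derivative, not the paper's processing chain:

```python
import numpy as np

def tilt_derivative(grid, dx=1.0):
    """Tilt angle of a potential-field grid: arctan(VDR / THDR).
    The vertical derivative (VDR) is computed in the wavenumber domain
    (|k| filter); horizontal derivatives use finite differences."""
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    kxx, kyy = np.meshgrid(kx, ky)
    k = np.sqrt(kxx**2 + kyy**2)
    vdr = np.real(np.fft.ifft2(np.fft.fft2(grid) * k))   # d/dz
    gy, gx = np.gradient(grid, dx)
    thdr = np.hypot(gx, gy)                              # total horizontal derivative
    return np.degrees(np.arctan2(vdr, thdr))             # zero crossing ~ source edge

# Synthetic Gaussian anomaly: the tilt is strongly positive over the source
# and its zero crossing roughly traces the source edge.
y, x = np.mgrid[-32:32, -32:32]
anomaly = np.exp(-(x**2 + y**2) / 50.0)
tilt = tilt_derivative(anomaly)
```

The tilt derivative is popular for edge detection precisely because its amplitude is bounded (here between -90 and +90 degrees) regardless of source depth.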

Keywords: magnetic, gravity, structural trend, depth to basement

Procedia PDF Downloads 132
23810 Copyright Clearance for Artificial Intelligence Training Data: Challenges and Solutions

Authors: Erva Akin

Abstract:

The use of copyrighted material for machine learning purposes is a challenging issue in the field of artificial intelligence (AI). While machine learning algorithms require large amounts of data to train and to improve their accuracy and creativity, the use of copyrighted material without permission from the authors may infringe their intellectual property rights. In order to overcome the legal hurdle that copyright poses to the sharing, access, and re-use of data, the use of copyrighted material for machine learning purposes may be considered permissible under certain circumstances. For example, if the copyright holder has given permission to use the data through a licensing agreement, then use for machine learning purposes may be lawful. It is also argued that copying for non-expressive purposes that do not involve conveying expressive elements to the public, such as automated data extraction, should not be seen as infringing. The focus of such ‘copy-reliant technologies’ is on understanding language rules, styles, and syntax; no creative ideas are being used. However, the non-expressive use defense sits within the framework of the fair use doctrine, which allows the use of copyrighted material for research or educational purposes. Questions arise because the fair use doctrine is not available in EU law; instead, the InfoSoc Directive provides a rigid system of exclusive rights with a list of exceptions and limitations. One could only argue that non-expressive uses of copyrighted material for machine learning purposes do not constitute a ‘reproduction’ in the first place. Nevertheless, the use of machine learning with copyrighted material is difficult because EU copyright law applies to the mere use of the works. Two solutions can be proposed to address the problem of copyright clearance for AI training data.
The first is to introduce a broad exception for text and data mining, either mandatorily or for commercial and scientific purposes, or to permit the reproduction of works for non-expressive purposes. The second is that copyright laws should permit the reproduction of works for non-expressive purposes, which opens the door to discussions regarding the transposition of the fair use principle from the US into EU law. Both solutions aim to provide more space for AI developers to operate and encourage greater freedom, which could lead to more rapid innovation in the field. The Data Governance Act presents a significant opportunity to advance these debates. Finally, issues concerning the balance of general public interests and legitimate private interests in machine learning training data must be addressed. In my opinion, it is crucial that robot-creation output should fall into the public domain. Machines depend on human creativity, innovation, and expression. To encourage technological advancement and innovation, freedom of expression and business operation must be prioritised.

Keywords: artificial intelligence, copyright, data governance, machine learning

Procedia PDF Downloads 83
23809 Biosorption of Phenol onto Water Hyacinth Activated Carbon: Kinetics and Isotherm Study

Authors: Manoj Kumar Mahapatra, Arvind Kumar

Abstract:

Batch adsorption experiments were carried out for the removal of phenol from aqueous solution using water hyacinth activated carbon (WHAC) as the adsorbent. The sorption kinetics were analyzed using the pseudo-first-order and pseudo-second-order models, and the sorption data were observed to fit the pseudo-second-order model very well over the entire sorption time. The experimental data were analyzed using the Langmuir and Freundlich isotherm models. The equilibrium data fitted the Freundlich model well, with a maximum biosorption capacity of 31.45 mg/g estimated using the Langmuir model. The adsorption intensity of 3.7975 indicates a favorable adsorption condition.
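
The Langmuir capacity estimate mentioned above is commonly obtained from a linearized fit of the isotherm. The sketch below uses synthetic equilibrium data, generated so the fit recovers a capacity of 31.45 mg/g; these are illustrative values, not the WHAC measurements:

```python
import numpy as np

# Hypothetical equilibrium data (Ce in mg/L, qe in mg/g), generated from a
# Langmuir isotherm qe = qm*KL*Ce / (1 + KL*Ce) for illustration.
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qm_true, KL_true = 31.45, 0.08          # qm matches the reported 31.45 mg/g
qe = qm_true * KL_true * Ce / (1 + KL_true * Ce)

# Linearized Langmuir: Ce/qe = (1/qm)*Ce + 1/(qm*KL), a straight line in Ce.
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qm = 1.0 / slope                         # maximum biosorption capacity (mg/g)
KL = slope / intercept                   # Langmuir constant (L/mg)

print(round(qm, 2), round(KL, 3))        # recovers 31.45 and 0.08
```

The Freundlich adsorption intensity n (3.7975 in the abstract) is obtained the same way from the linearized form log(qe) = log(KF) + (1/n)·log(Ce); n > 1 indicates favorable adsorption.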

Keywords: adsorption, isotherm, kinetics, phenol

Procedia PDF Downloads 446
23808 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication

Authors: Vedant Janapaty

Abstract:

Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Known as the “kidneys of our planet”, they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and the loss of native species are ailing our wetlands, and there is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate the derived metrics with in situ observations collected at five estuaries. Satellite images were processed in Python to calculate 7 spectral indices (SIs), and average SI values were calculated per month over 23 years. Publicly available data from 6 sites at ELK were used to obtain 10 in situ parameters (OPs), whose average values were likewise calculated per month over 23 years. Linear correlations between the 7 SIs and 10 OPs were computed and found to be inadequate (correlations of 1 to 64%). A Fourier transform analysis was then performed on the 7 SIs; dominant frequencies and amplitudes were extracted for each SI, and a machine learning (ML) model was trained, validated, and tested against the 10 OPs. Better correlations were observed between SIs and OPs at certain time delays (0-, 3-, 4-, and 6-month delays), and the ML modeling was repeated; the OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with the critical in situ parameters that measure eutrophication, and can be used by practitioners to easily monitor wetland health.
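
The dominant-frequency extraction step can be sketched with numpy's real FFT. The monthly series below is a hypothetical spectral index with an annual cycle plus noise, standing in for the study's SIs:

```python
import numpy as np

# Hypothetical monthly spectral-index series over 23 years: an annual cycle
# of amplitude 0.4 plus observation noise.
rng = np.random.default_rng(42)
months = np.arange(23 * 12)
si = 0.4 * np.sin(2 * np.pi * months / 12) + 0.05 * rng.normal(size=len(months))

# Real FFT of the de-meaned series; skip the DC bin, then take the strongest
# frequency bin and convert it to a period and a sinusoid amplitude.
spec = np.fft.rfft(si - si.mean())
freqs = np.fft.rfftfreq(len(months), d=1.0)          # cycles per month
peak = 1 + np.argmax(np.abs(spec[1:]))
dominant_period = 1.0 / freqs[peak]                  # months per cycle
amplitude = 2.0 * np.abs(spec[peak]) / len(months)

print(round(dominant_period), round(amplitude, 2))   # ~12-month period, ~0.4
```

Feeding such (period, amplitude) pairs per SI into the ML model, rather than the raw monthly values, is what let the study capture the seasonal structure the linear correlations missed.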

Keywords: estuary, remote sensing, machine learning, Fourier transform

Procedia PDF Downloads 104
23807 Agricultural Water Consumption Estimation in the Helmand Basin

Authors: Mahdi Akbari, Ali Torabi Haghighi

Abstract:

The Hamun Lakes, located in the Helmand Basin and consisting of four water bodies, were the largest (>8500 km²) freshwater bodies on the Iranian plateau but have almost entirely desiccated over the last 20 years. The desiccation of the lakes has caused dust storms in the region, with huge economic and health consequences for the inhabitants. The flow of the Hirmand (or Helmand) River, the most important feeding river, has decreased downstream from 4 to 1.9 km³ due to anthropogenic activities. In this basin, water is mainly consumed for farming. Due to the lack of in-situ data in the basin, this research utilizes remote-sensing data to show how croplands, and consequently the water consumed by the agricultural sector, have changed. Based on Landsat NDVI, we suggest using a threshold of around 0.35-0.4 to detect croplands in the basin. The croplands of this basin have doubled since 1990, especially downstream of the Kajaki Dam (the biggest dam in the basin). Using PML_V2 actual evapotranspiration (AET) data and considering irrigation efficiency (≈0.3), we estimated the water consumed (CW) for farming and found that CW increased from 2.5 to over 7.5 km³ between 2002 and 2017 in this basin. Also, the annual average potential evapotranspiration (PET) of the basin has shown a negative trend in recent years, although the AET over croplands has an increasing trend. In this research, using remote sensing data, we compensated for the lack of in-situ data in the studied area and highlighted the anthropogenic activities upstream that led to the desiccation of the lakes downstream.
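
The NDVI thresholding step can be sketched in a few lines of numpy. The reflectance values below are invented for illustration; only the threshold of about 0.35 comes from the abstract:

```python
import numpy as np

# Hypothetical Landsat red / near-infrared reflectance tiles (values 0-1).
red = np.array([[0.10, 0.30], [0.08, 0.25]])
nir = np.array([[0.45, 0.35], [0.50, 0.28]])

ndvi = (nir - red) / (nir + red + 1e-9)      # small epsilon avoids divide-by-zero
cropland = ndvi >= 0.35                      # threshold suggested by the study

crop_fraction = cropland.mean()              # share of pixels flagged as cropland
print(np.round(ndvi, 2))
print(crop_fraction)
```

Multiplying the cropland pixel count by the pixel area, and the AET over those pixels by 1/efficiency, gives the kind of consumed-water estimate the abstract reports.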

Keywords: Afghanistan-Iran transboundary Basin, Iran-Afghanistan water treaty, water use, lake desiccation

Procedia PDF Downloads 131
23806 Data-Driven Strategies for Enhancing Food Security in Vulnerable Regions: A Multi-Dimensional Analysis of Crop Yield Predictions, Supply Chain Optimization, and Food Distribution Networks

Authors: Sulemana Ibrahim

Abstract:

Food security remains a paramount global challenge, with vulnerable regions grappling with issues of hunger and malnutrition. This study embarks on a comprehensive exploration of data-driven strategies aimed at ameliorating food security in such regions. Our research employs a multifaceted approach, integrating data analytics to predict crop yields, optimizing supply chains, and enhancing food distribution networks. The study unfolds as a multi-dimensional analysis, commencing with the development of robust machine learning models harnessing remote sensing data, historical crop yield records, and meteorological data to foresee crop yields. These predictive models, underpinned by convolutional and recurrent neural networks, furnish critical insights into anticipated harvests, empowering proactive measures to confront food insecurity. Subsequently, the research scrutinizes supply chain optimization to address food security challenges, capitalizing on linear programming and network optimization techniques. These strategies intend to mitigate loss and wastage while streamlining the distribution of agricultural produce from field to fork. In conjunction, the study investigates food distribution networks with a particular focus on network efficiency, accessibility, and equitable food resource allocation. Network analysis tools, complemented by data-driven simulation methodologies, unveil opportunities for augmenting the efficacy of these critical lifelines. This study also considers the ethical implications and privacy concerns associated with the extensive use of data in the realm of food security. The proposed methodology outlines guidelines for responsible data acquisition, storage, and usage. The ultimate aspiration of this research is to forge a nexus between data science and food security policy, bestowing actionable insights to mitigate the ordeal of food insecurity. 
The holistic approach converging data-driven crop yield forecasts, optimized supply chains, and improved distribution networks aspire to revitalize food security in the most vulnerable regions, elevating the quality of life for millions worldwide.
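
The supply-chain optimization step described above typically reduces to a transportation-style linear program. A toy sketch follows, assuming SciPy's linprog; the depots, regions, costs, supplies, and demands are all hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical transportation problem: ship grain from 2 depots to 3 regions
# at minimum cost, without exceeding depot supply, while meeting demand.
cost = np.array([[4.0, 6.0, 9.0],      # cost[i, j]: depot i -> region j, per tonne
                 [5.0, 3.0, 7.0]])
supply = [60, 50]                      # tonnes available at each depot
demand = [30, 40, 40]                  # tonnes required in each region

c = cost.ravel()                       # decision variables x[i, j], flattened row-wise
A_ub, b_ub = [], []                    # capacity: sum_j x[i, j] <= supply[i]
for i in range(2):
    row = np.zeros(6); row[i * 3:(i + 1) * 3] = 1
    A_ub.append(row); b_ub.append(supply[i])
A_eq, b_eq = [], []                    # demand met exactly: sum_i x[i, j] = demand[j]
for j in range(3):
    row = np.zeros(6); row[j::3] = 1
    A_eq.append(row); b_eq.append(demand[j])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
plan = res.x.reshape(2, 3)             # optimal shipment plan, tonnes
print(res.fun)                         # minimum total transport cost
```

Real supply-chain models add spoilage, storage, and time periods, but they keep this same structure of a linear objective with capacity and demand constraints.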

Keywords: data-driven strategies, crop yield prediction, supply chain optimization, food distribution networks

Procedia PDF Downloads 62
23805 A Statistical Approach to Classification of Agricultural Regions

Authors: Hasan Vural

Abstract:

Turkey is a favorable country for producing a great variety of agricultural products because of its varied geographic and climatic conditions, which have been used to divide the country into four main and seven sub-regions. This seven-region classification has traditionally been used for data collection and publication, especially in relation to agricultural production. Later, nine agricultural regions were considered. More recently, the governmental body responsible for data collection and dissemination (the Turkish Institute of Statistics, TIS) has used 12 classes, comprising 11 sub-regions and Istanbul province. This study evaluates these classification efforts based on the acreage of ten main crops over a ten-year period (1996-2005). The panel data, grouped into the 11 sub-regions, were evaluated by cluster analysis and multivariate statistical methods. It was concluded that, from the agricultural production point of view, it would be more meaningful to consider three main and eight sub-agricultural regions throughout the country.
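
The clustering step can be sketched with a plain k-means on crop-acreage profiles. The eight subregions and three crop shares below are hypothetical stand-ins for the ten-crop panel, and the simple seeding choice (one starting centroid per visible group) is an illustration, not the study's method:

```python
import numpy as np

# Hypothetical crop-acreage shares for 8 subregions (rows) x 3 crops (cols).
X = np.array([
    [0.80, 0.10, 0.10], [0.70, 0.20, 0.10],   # cereal-dominated subregions
    [0.20, 0.70, 0.10], [0.10, 0.80, 0.10],   # fruit/vegetable-dominated
    [0.10, 0.20, 0.70], [0.20, 0.10, 0.70],   # industrial-crop-dominated
    [0.75, 0.15, 0.10], [0.15, 0.75, 0.10],
])

def kmeans(X, centers, iters=20):
    """Plain k-means: assign each row to its nearest centroid, recompute."""
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == c].mean(0) for c in range(len(centers))])
    return labels

labels = kmeans(X, X[[0, 2, 4]].copy())   # seed with one row from each profile
print(labels)                             # subregions grouped by crop profile
```

Grouping subregions whose acreage profiles cluster together is exactly the kind of evidence behind the study's recommendation of three main and eight sub-agricultural regions.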

Keywords: agricultural region, factorial analysis, cluster analysis

Procedia PDF Downloads 416
23804 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings

Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and failures; indeed, various faults, failures, and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Several literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers; occurrences of data gaps have received little attention in the academic literature. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are considered to be regular time series; in reality, however, sensor values are not uniformly sampled. The issue to solve is therefore: beyond what delay should each sensor be considered faulty? Time series analysis is required to detect abnormalities in these delays. The efficiency of the method is evaluated on measurements obtained from a real platform: an office at the Grenoble Institute of Technology equipped with 30 sensors.
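
A minimal sketch of per-sensor gap detection follows. The timestamps are hypothetical, and the rule (flag an inter-sample delay exceeding three times the sensor's own median delay) is an illustrative automatic threshold, not necessarily the paper's:

```python
import numpy as np

# Hypothetical timestamps (minutes) from one irregularly sampled sensor,
# with a long silence between t=31 and t=160: a candidate data gap.
t = np.array([0, 5, 11, 15, 21, 26, 31, 160, 165, 171])

delays = np.diff(t)
# Automatic threshold derived from the sensor's own sampling behaviour:
# a delay counts as a gap when it exceeds the median delay by a large factor.
threshold = 3 * np.median(delays)
gaps = np.where(delays > threshold)[0]

for i in gaps:
    print(f"gap of {delays[i]} min between t={t[i]} and t={t[i+1]}")
```

Because the threshold is computed per sensor, the same rule adapts to a fast temperature probe and a slow energy meter without hand-tuned constants, which is the point of automatic thresholding across heterogeneous sensors.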

Keywords: building system, time series, diagnosis, outliers, delay, data gap

Procedia PDF Downloads 245
23803 A Prospective Neurosurgical Registry Evaluating the Clinical Care of Traumatic Brain Injury Patients Presenting to Mulago National Referral Hospital in Uganda

Authors: Benjamin J. Kuo, Silvia D. Vaca, Joao Ricardo Nickenig Vissoci, Catherine A. Staton, Linda Xu, Michael Muhumuza, Hussein Ssenyonjo, John Mukasa, Joel Kiryabwire, Lydia Nanjula, Christine Muhumuza, Henry E. Rice, Gerald A. Grant, Michael M. Haglund

Abstract:

Background: Traumatic Brain Injury (TBI) is disproportionately concentrated in low- and middle-income countries (LMICs), with the odds of dying from TBI in Uganda more than 4 times higher than in high-income countries (HICs). The disparities in injury incidence and outcome between LMICs and resource-rich settings have led to increased health outcomes research on TBIs and their associated risk factors in LMICs. While TBI studies in LMICs have been increasing over the last decade, there is still a need for more robust prospective registries. In Uganda, a trauma registry implemented in 2004 at the Mulago National Referral Hospital (MNRH) showed that road traffic injury (RTI) is the major contributor (60%) to overall mortality in the casualty department. While the prior registry provides information on injury incidence and burden, it is limited in scope: it does not follow patients longitudinally throughout their hospital stay, nor does it focus specifically on TBIs. Although these retrospective analyses are helpful for benchmarking TBI outcomes, they make it hard to identify specific quality improvement initiatives. The relationships among epidemiology, patient risk factors, clinical care, and TBI outcomes are still relatively unknown at MNRH. Objective: The objectives of this study are to describe the processes of care and determine risk factors predictive of poor outcomes for TBI patients presenting to a single tertiary hospital in Uganda. Methods: Prospective data were collected for 563 TBI patients presenting to a tertiary hospital in Kampala from 1 June to 30 November 2016. Research Electronic Data Capture (REDCap) was used to systematically collect variables spanning 8 categories. Univariate and multivariate analyses were conducted to determine significant predictors of mortality. Results: 563 TBI patients were enrolled from 1 June to 30 November 2016.
102 patients (18%) received surgery, 29 patients (5.1%) intended for surgery failed to receive it, and 251 patients (45%) received non-operative management. Overall mortality was 9.6%, which ranged from 4.7% for mild and moderate TBI to 55% for severe TBI patients with GCS 3-5. Within each TBI severity category, mortality differed by management pathway. Variables predictive of mortality were TBI severity, more than one intracranial bleed, failure to receive surgery, high dependency unit admission, ventilator support outside of surgery, and hospital arrival delayed by more than 4 hours. Conclusions: The overall mortality rate of 9.6% in Uganda for TBI is high, and likely underestimates the true TBI mortality. Furthermore, the wide-ranging mortality (3-82%), high ICU fatality, and negative impact of care delays suggest shortcomings with the current triaging practices. Lack of surgical intervention when needed was highly predictive of mortality in TBI patients. Further research into the determinants of surgical interventions, quality of step-up care, and prolonged care delays are needed to better understand the complex interplay of variables that affect patient outcome. These insights guide the development of future interventions and resource allocation to improve patient outcomes.
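As a rough sketch of the univariate analysis described above, the association between a candidate risk factor and mortality can be screened with a 2×2 odds ratio and a 95% confidence interval. The patient counts below are hypothetical and only illustrate the calculation, not the study's data:

```python
import math

def odds_ratio(exp_dead, exp_alive, unexp_dead, unexp_alive):
    """Univariate odds ratio with a Woolf 95% confidence interval."""
    odds = (exp_dead * unexp_alive) / (exp_alive * unexp_dead)
    se = math.sqrt(1 / exp_dead + 1 / exp_alive + 1 / unexp_dead + 1 / unexp_alive)
    lo = math.exp(math.log(odds) - 1.96 * se)
    hi = math.exp(math.log(odds) + 1.96 * se)
    return odds, (lo, hi)

# Hypothetical 2x2 table: mortality among patients who failed to receive
# intended surgery versus those who received it.
odds, ci = odds_ratio(10, 19, 8, 94)
```

If the 95% interval excludes 1, the factor would be flagged as a significant predictor and carried forward into the multivariate model.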

Keywords: care continuum, global neurosurgery, Kampala Uganda, LMIC, Mulago, prospective registry, traumatic brain injury

Procedia PDF Downloads 236
23802 Development of Tourism Infrastructure and Cultural Heritage: Case of Gobustan Preserve

Authors: Rufat Nuriyev

Abstract:

Located in the eastern part of the Republic of Azerbaijan on the western shore of the Caspian Sea, the Gobustan National Reserve was inscribed on the World Heritage List as the Gobustan Rock Art Cultural Landscape in 2007. Gobustan is an outstanding rock art landscape where over 6,000 rock engravings, dating from the end of the Upper Paleolithic to the Middle Ages, have been found and registered. As a rock art center, Gobustan seeks to stimulate public awareness and disseminate knowledge of prehistoric art in order to enrich educational, cultural, and artistic communities regionally, nationally, and internationally. Following a Decree of the President of the Republic of Azerbaijan and the accompanying “Action Plan”, the planned measures began to be implemented, some of them ahead of the stipulated dates. Various activities are organized to attract visitors and improve the quality of service in the museum-reserve. The building of a new museum center at the foot of Beyukdash Mountain was completed in 2011. The main aim of the new museum building and exhibition was to provide a better understanding of the importance of this monument for the local community, Azerbaijani culture, and the world. In the Petroglyph Museum at Gobustan, digital and traditional media are closely integrated to reveal the complexity of the historical, cultural, and artistic meaning of the prehistoric rock carvings of Gobustan. Alongside electronic devices, visitors get the opportunity for direct contact with artifacts and ancient rock carvings.

Keywords: Azerbaijan, Gobustan, rock art, museum

Procedia PDF Downloads 302
23801 Artificial Reproduction System and Imbalanced Dataset: A Mendelian Classification

Authors: Anita Kushwaha

Abstract:

We propose a new evolutionary computational model called the Artificial Reproduction System, which is based on the complex process of meiotic reproduction occurring between male and female cells of living organisms. The Artificial Reproduction System is an attempt at a new computational intelligence approach inspired by the theoretical reproduction mechanism and by observed reproduction functions, principles, and mechanisms. A reproductive organism is programmed by genes and can be viewed as an automaton, mapping and reducing so as to create copies of those genes in its offspring. In the Artificial Reproduction System, the binding mechanism between male and female cells is studied, parameters are chosen, a network is constructed, and a feedback system for self-regularization is established. The model then applies Mendel's laws of inheritance and allele-allele associations, and can be used to perform data analysis of imbalanced, multivariate, multiclass, and big data. In the experimental study, the Artificial Reproduction System is compared with state-of-the-art classifiers such as SVM, Radial Basis Function networks, neural networks, and K-Nearest Neighbor on several benchmark datasets, and the comparison results indicate good performance.
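The Mendelian inheritance rules the model draws on can be illustrated with a minimal Punnett-square calculation. This is a conceptual sketch of the underlying genetics only, not the authors' implementation:

```python
from collections import Counter
from itertools import product

def punnett(parent1: str, parent2: str) -> dict:
    """Offspring genotype probabilities for a single gene with two alleles.
    Each parent contributes one allele; genotypes are normalized so that
    the dominant (uppercase) allele is written first."""
    counts = Counter(''.join(sorted(a + b)) for a, b in product(parent1, parent2))
    total = sum(counts.values())
    return {genotype: n / total for genotype, n in counts.items()}

# Cross of two heterozygous parents: classic 1:2:1 genotype ratio.
offspring = punnett('Aa', 'Aa')
```

`punnett('Aa', 'Aa')` returns `{'AA': 0.25, 'Aa': 0.5, 'aa': 0.25}`, the probability table that an inheritance-based model would sample from when producing offspring solutions.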

Keywords: bio-inspired computation, nature-inspired computation, natural computing, data mining

Procedia PDF Downloads 272
23800 Critical Evaluation and Analysis of Effects of Different Queuing Disciplines on Packets Delivery and Delay for Different Applications

Authors: Omojokun Gabriel Aju

Abstract:

A communication network enables the exchange of data between two or more devices over some form of transmission medium using communication protocols. The data could be text, images, audio, video, or numbers, which can be grouped into FTP, Email, HTTP, VoIP, or Video applications. The effectiveness of such an exchange is proved if the data are accurately delivered within a specified time. Some senders do not particularly mind when the data are actually received, as long as the receiver acknowledges receipt; for other senders, the time the data take to reach the receiver can be critical, as any delay could cause serious problems or even render the data useless. The validity or invalidity of data after a delay therefore depends on the type of data (information). It is consequently imperative for a network device (such as a router) to be able to differentiate between packets that are time-sensitive and those that are not when they pass through the same network. This is where queuing disciplines come into play: they govern how network resources are handled when a network is designed to service widely varying types of traffic, managing the available resources according to the configured policies. As part of its resource allocation mechanisms, a router within the network must therefore implement a queuing discipline that regulates how packets are buffered while waiting to be transmitted. Various queuing disciplines are used to control the transmission of these packets by determining which packets get the highest priority, which get lower priority, and which are dropped. The queuing discipline thus controls packet latency by determining how long a packet can wait to be transmitted or be dropped.
The common queuing disciplines are first-in-first-out queuing (FIFO), priority queuing (PQ), and weighted-fair queuing (WFQ). This paper critically evaluates and analyses, through the use of the Optimized Network Evaluation Tool (OPNET) Modeller Version 14.5, the effects of these three queuing disciplines on the performance of five different applications (FTP, HTTP, E-Mail, Voice, and Video) within specified parameters, using packets sent, packets received, and transmission delay as performance metrics. The paper finally suggests ways in which networks can be designed to provide better transmission performance while using these queuing disciplines.
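The contrast between FIFO and PQ can be sketched as a toy dequeue-order simulation (not OPNET; the application names and priority values below are illustrative):

```python
import heapq
from collections import deque

def fifo_order(packets):
    """Serve packets strictly in arrival order, ignoring priority."""
    queue = deque(packets)
    return [queue.popleft()[0] for _ in range(len(packets))]

def pq_order(packets):
    """Serve the highest-priority packet first (lower number = higher
    priority); ties are broken by arrival order via the index field."""
    heap = [(prio, i, name) for i, (name, prio) in enumerate(packets)]
    heapq.heapify(heap)
    order = []
    while heap:
        order.append(heapq.heappop(heap)[2])
    return order

# Illustrative arrivals: voice is delay-sensitive, FTP is not.
arrivals = [('voice1', 0), ('ftp1', 2), ('http1', 1), ('voice2', 0)]
```

`fifo_order(arrivals)` keeps the arrival order, while `pq_order(arrivals)` moves both voice packets ahead of FTP, which is exactly the latency trade-off the paper measures; WFQ would additionally guarantee lower-priority queues a weighted share of the bandwidth so they are never starved.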

Keywords: applications, first-in-first-out queuing (FIFO), optimised network evaluation tool (OPNET), packets, priority queuing (PQ), queuing discipline, weighted-fair queuing (WFQ)

Procedia PDF Downloads 358
23799 Data Confidentiality in Public Cloud: A Method for Inclusion of ID-PKC Schemes in OpenStack Cloud

Authors: N. Nalini, Bhanu Prakash Gopularam

Abstract:

The term data security refers to the degree of resistance or protection given to information against unintended or unauthorized access. The core principles of information security are confidentiality, integrity, and availability, also referred to as the CIA triad. Cloud computing services are classified as SaaS, IaaS, and PaaS services. With cloud adoption, confidential enterprise data are moved from organization premises to an untrusted public network, and as a result the attack surface has increased manifold. Several cloud computing platforms, such as OpenStack, Eucalyptus, and Amazon EC2, allow users to build and configure public, hybrid, and private clouds. While traditional encryption based on PKI infrastructure still works in the cloud scenario, the management of public-private keys and trust certificates is difficult. Identity-based Public Key Cryptography (ID-PKC) overcomes this problem by using publicly identifiable information to generate the keys, and it works well with decentralized systems: users can exchange information securely without having to manage any trust information. Another advantage is that access control information (such as a role-based access control policy) can be embedded into the data itself, unlike in PKI, where it is handled by a separate component or system. In the OpenStack cloud platform, the Keystone service acts as the identity service for authentication and authorization and includes support for public key infrastructure. In this paper, we explain the OpenStack security architecture and evaluate its PKI infrastructure with respect to data confidentiality. We provide a method to integrate ID-PKC schemes for securing data both in transit and at rest, and explain the key measures for safeguarding data against security attacks. The proposed approach uses the JPBC crypto library for key-pair generation based on the IEEE P1363.3 standard and provides secure communication to other cloud services.
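The key-escrow structure of ID-PKC — a key generation centre deriving a user's private key from publicly identifiable information — can be sketched conceptually as follows. This stand-in uses an HMAC rather than the pairing-based mathematics of real ID-PKC (which the paper implements via JPBC), and all names and values are illustrative:

```python
import hashlib
import hmac

MASTER_SECRET = b'kgc-master-secret'  # held only by the key generation centre

def extract_key(identity: str) -> bytes:
    """Derive a per-user key from a public identity (e.g. an email address).
    In real ID-PKC the extraction uses pairing-friendly elliptic curves;
    HMAC is used here only to show the structural point: the identity is
    the public key, so no certificate needs to be distributed."""
    return hmac.new(MASTER_SECRET, identity.encode(), hashlib.sha256).digest()

alice_key = extract_key('alice@example.org')
bob_key = extract_key('bob@example.org')
```

Because the identity itself plays the role of the public key, two parties need no certificate exchange before communicating; the trade-off is that the key generation centre can derive every user's key, which is why it must be a trusted component — the role Keystone would play in the proposed OpenStack integration.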

Keywords: data confidentiality, identity-based cryptography, secure communication, OpenStack Keystone, token scoping

Procedia PDF Downloads 384
23798 Recovery of Copper and Gold by Delamination of Printed Circuit Boards Followed by Leaching and Solvent Extraction Process

Authors: Kamalesh Kumar Singh

Abstract:

Due to the increasing volume of electronic waste, especially ICT-related gadgets, green recycling remains a major challenge. This article presents a two-stage, eco-friendly hydrometallurgical route for the recovery of gold from the delaminated metallic layers of waste mobile phone Printed Circuit Boards (PCBs). Initially, mobile phone PCBs are downsized (1 × 1 cm²) and treated with the organic solvent dimethylacetamide (DMA) to separate the metallic fraction from the non-metallic glass fiber. In the first stage, the liberated metallic sheets are used for the selective dissolution of copper in an aqueous leaching reagent. The influence of various parameters, such as the type of leaching reagent, concentration of the solution, temperature, time, and pulp density, is optimized for effective (almost 100%) leaching of copper. Results show that 3M nitric acid is a suitable reagent for copper leaching at room temperature and that, given its chemical features, gold remains in the solid residue. In the second stage, the separated residue is used for the recovery of gold using sulphuric acid combined with a halide salt. In this halide leaching, Cl₂ or Br₂ is generated as an in-situ oxidant to improve the leaching of gold. Results show that almost 92% of the gold is recovered at the optimized parameters.

Keywords: printed circuit boards, delamination, leaching, solvent extraction, recovery

Procedia PDF Downloads 57
23797 Development of Catalyst, Incorporating Phosphinite Ligands, for Transfer Hydrogenation

Authors: S. Assylbekova, D. Zolotareva, A. Dauletbakov, Ye. Belyankova, S. Bayazit, A. Basharimova, A. Zazybin, A. Isimberlenova, A. Kakimova, M. Aydemir, A. Kairullinova

Abstract:

Transfer hydrogenation (TH) is a key process in organic chemistry, especially in pharmaceutical and agrochemical synthesis, offering a safer and more sustainable approach compared to traditional methods. This work is devoted to the synthesis and use of ruthenium catalysts containing phosphinite ligands in TH reactions. Ruthenium complexes are particularly noteworthy for their effectiveness in asymmetric TH; their stability and adaptability to different reaction environments make them ideal for both laboratory-scale and industrial applications. Phosphinite ligands (P(OR)R'₂) are used in the synthesis of the complexes to improve their properties. These ligands are known for their ability to finely tune the electronic and steric properties of metal centers: the electron-donating nature of the phosphorus atom, combined with the variability of the R and R' groups, allows significant customization of the catalyst's properties. The novelty of this work is the incorporation of a hydrophilic ionic liquid into the phosphinite ligand, which is then converted into a catalyst. The procedure involves the synthesis of the phosphinite ligand bearing the ionic liquid at room temperature under an inert atmosphere, followed by formation of the ruthenium complex. The TH reactions of acetophenone and its derivatives are then carried out using the resulting catalyst, and the conversion of ketone to alcohol is analyzed using a gas chromatograph. This study contributes to the understanding of the influence of the catalyst's physico-chemical properties on transfer hydrogenation results.

Keywords: transfer hydrogenation, ruthenium, catalysts, phosphinite ligands

Procedia PDF Downloads 64
23796 A Proposed Framework for Digital Librarianship in Academic Libraries

Authors: Daniel Vaati Nzioka, John Oredo, Dorothy Muthoni Njiraine

Abstract:

Service delivery in academic libraries has been regressing because Digital Librarians (DLns) fail to perform optimally. This study aimed at developing a proposed framework for digital librarianship in academic libraries, with special emphasis on three selected public academic institutional libraries. The study's specific objectives were to determine the roles played by current DLns in academic libraries, establish the job descriptions of DLns in various academic libraries, ascertain DLns' best practices, and implement a viable digital librarianship conceptual framework. The study used a survey design with an open-ended questionnaire constructed around the objectives of the study. A purposive sample of 30 Library and Information Science (LIS) professionals in charge of Digital Information Services (DIS) and managing electronic resources was drawn from the three selected academic libraries and interviewed. A piloted self-administered questionnaire was used to gather information from these respondents: a total of thirty (30) questionnaires were administered to the LIS professionals, ten in each of the three selected academic libraries. The study developed a proposed conceptual framework for DLns that details the pertinent issues currently facing academic libraries when hiring DLns. The study recommends that this framework be adopted to guide library managers in identifying staff training needs, selecting the most adequate training method, and settling on the best practices on which staff should be trained.

Keywords: digital, academic, libraries, framework

Procedia PDF Downloads 108
23795 Improved Distance Estimation in Dynamic Environments through Multi-Sensor Fusion with Extended Kalman Filter

Authors: Iffat Ara Ebu, Fahmida Islam, Mohammad Abdus Shahid Rafi, Mahfuzur Rahman, Umar Iqbal, John Ball

Abstract:

The application of multi-sensor fusion for enhanced distance estimation accuracy in dynamic environments is crucial for advanced driver assistance systems (ADAS) and autonomous vehicles. Limitations of single sensors such as cameras or radar in adverse conditions motivate the use of combined camera and radar data to improve reliability, adaptability, and object recognition. A multi-sensor fusion approach using an extended Kalman filter (EKF) is proposed to combine sensor measurements with a dynamic system model, achieving robust and accurate distance estimation. The research utilizes the Mississippi State University Autonomous Vehicular Simulator (MAVS) to create a controlled environment for data collection. Data analysis is performed using MATLAB. Qualitative (visualization of fused data vs ground truth) and quantitative metrics (RMSE, MAE) are employed for performance assessment. Initial results with simulated data demonstrate accurate distance estimation compared to individual sensors. The optimal sensor measurement noise variance and plant noise variance parameters within the EKF are identified, and the algorithm is validated with real-world data from a Chevrolet Blazer. In summary, this research demonstrates that multi-sensor fusion with an EKF significantly improves distance estimation accuracy in dynamic environments. This is supported by comprehensive evaluation metrics, with validation transitioning from simulated to real-world data, paving the way for safer and more reliable autonomous vehicle control.
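The sequential camera/radar update at the heart of this approach can be sketched with a one-dimensional (linear) Kalman filter; the measurement values and noise variances below are illustrative, not the tuned parameters from the study:

```python
def kf_update(x, P, z, R, Q=0.01):
    """One predict/update cycle for a scalar constant-distance model.
    x, P: state estimate and its variance; z, R: measurement and its
    variance; Q: process noise added in the prediction step."""
    P = P + Q                    # predict: process noise inflates uncertainty
    K = P / (P + R)              # Kalman gain: trust measurement vs. prior
    x = x + K * (z - x)          # update the estimate with the measurement
    P = (1.0 - K) * P            # updated (reduced) uncertainty
    return x, P

x, P = 0.0, 100.0                        # vague prior on the distance (m)
x, P = kf_update(x, P, z=11.0, R=4.0)    # noisy camera range
x, P = kf_update(x, P, z=9.8, R=1.0)     # more precise radar range
```

After both updates the estimate sits near the (assumed) true distance of 10 m with a variance well below either sensor's alone, which is the gain sensor fusion buys; the EKF in the paper generalizes this scalar sketch to a nonlinear dynamic vehicle model.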

Keywords: sensor fusion, EKF, MATLAB, MAVS, autonomous vehicle, ADAS

Procedia PDF Downloads 43
23794 Equation for Predicting Inferior Vena Cava Diameter as a Potential Pointer for Heart Failure Diagnosis among Adult in Azare, Bauchi State, Nigeria

Authors: M. K. Yusuf, W. O. Hamman, U. E. Umana, S. B. Oladele

Abstract:

Background: Dilatation of the inferior vena cava (IVC) is used as an ultrasonic diagnostic feature in patients suspected of congestive heart failure (CHF). The IVC diameter has been reported to vary across body mass index (BMI) and body shape index (ABSI) categories, and knowledge of these variations is useful to imaging scientists in the precise diagnosis of CHF. Aim: The study aimed to establish an equation for predicting the ultrasonic mean diameter of the IVC across the BMI/ABSI categories of the inhabitants of Azare, Bauchi State, Nigeria. Methodology: Two hundred physically healthy adult subjects of both sexes, recruited using a structured questionnaire after giving informed consent for an abdominal ultrasound scan, were classified as underweight, normal weight, overweight, or obese using their BMIs. The probe was placed on the midline of the body, halfway between the xiphoid process and the umbilicus, with the probe marker directed towards the patient's head to obtain a longitudinal view of the IVC. The maximum IVC diameter was measured from the subcostal view using the electronic caliper of the scan machine. The mean value of each group was obtained, and the results were analysed. Results: A novel equation (IVC diameter = 1.04 + 0.01X, where X = BMI) was generated for determining the IVC diameter among the populace. Conclusion: An equation for predicting the IVC diameter from individual BMI values in apparently healthy subjects has been established.
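The regression equation reported above is straightforward to apply in code; a minimal sketch follows (the units of the predicted diameter are those of the study's ultrasound measurements, which the abstract does not state explicitly, and the example BMI is an illustrative input):

```python
def predicted_ivc_diameter(bmi: float) -> float:
    """Study's regression: IVC diameter = 1.04 + 0.01 * BMI."""
    if bmi <= 0:
        raise ValueError('BMI must be positive')
    return 1.04 + 0.01 * bmi

# e.g. an adult at the upper end of the normal-weight range, BMI = 25
d = predicted_ivc_diameter(25.0)
```

Since the slope is positive, the predicted diameter increases monotonically with BMI, consistent with the reported variation across adiposity categories.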

Keywords: equation, ultrasonic, IVC diameter, body adiposities

Procedia PDF Downloads 72
23793 A User Identification Technique to Access Big Data Using Cloud Services

Authors: A. R. Manu, V. K. Agrawal, K. N. Balasubramanya Murthy

Abstract:

Authentication is required in stored database systems so that only authorized users can access the data and the related cloud infrastructure. This paper proposes an authentication technique using a multi-factor, multi-dimensional authentication system with multi-level security. The proposed technique is likely to be more robust, as the probability of breaking the password is extremely low. The framework uses a multi-modal biometric approach and SMS to enforce additional security measures alongside the conventional login/password system. The robustness of the technique is demonstrated mathematically using statistical analysis. This work presents the authentication system along with the user authentication architecture diagram, activity diagrams, data flow diagrams, sequence diagrams, and algorithms.
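The intuition that stacking independent factors drives the break probability down multiplicatively can be sketched as follows; the per-factor probabilities are purely illustrative, not figures from the paper's statistical analysis:

```python
def combined_break_probability(*factor_probs: float) -> float:
    """If the factors are independent, an attacker must defeat all of them,
    so the overall probability is the product of the per-factor ones."""
    p = 1.0
    for q in factor_probs:
        if not 0.0 <= q <= 1.0:
            raise ValueError('probabilities must lie in [0, 1]')
        p *= q
    return p

# e.g. password guess, SMS OTP interception, biometric spoof (illustrative)
p = combined_break_probability(1e-4, 1e-3, 1e-2)
```

Three modest factors already push the combined probability to around 10⁻⁹; the independence assumption is the part that any rigorous analysis, including the paper's, has to justify in practice.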

Keywords: design, implementation algorithms, performance, biometric approach

Procedia PDF Downloads 476
23792 Input Data Balancing in a Neural Network PM-10 Forecasting System

Authors: Suk-Hyun Yu, Heeyong Kwon

Abstract:

Recently, PM-10 has become a social and global issue. It is one of the major air pollutants affecting human health, so it needs to be forecast rapidly and precisely. However, PM-10 comes from various emission sources, and its concentration level depends heavily on the meteorological and geographical factors of the local and global region, which makes forecasting PM-10 concentration very difficult. A neural network model can be used in this case, but cases of high PM-10 concentration are rare, which makes training the neural network model difficult. In this paper, we suggest a simple input balancing method for use when the data distribution is uneven, based on the probability of appearance of the data. Experimental results show that input balancing makes the neural networks' learning easier and improves the forecasting rates.
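One way to read the proposed balancing — weighting each training sample by the inverse of its bin's appearance probability — can be sketched as follows. This is an interpretation of the idea, not the authors' exact procedure, and the bin boundary and readings are illustrative:

```python
import random
from collections import Counter

def balance_by_appearance(samples, bin_of, k, seed=0):
    """Resample the training set so that rare bins (e.g. high PM-10
    concentrations) appear about as often as common ones: each sample is
    drawn with probability inversely proportional to its bin's frequency."""
    freq = Counter(bin_of(s) for s in samples)
    weights = [1.0 / freq[bin_of(s)] for s in samples]
    return random.Random(seed).choices(samples, weights=weights, k=k)

# 90 ordinary readings, 10 high-concentration episodes (illustrative data)
raw = [20.0] * 90 + [150.0] * 10
balanced = balance_by_appearance(raw, bin_of=lambda v: v >= 100.0, k=1000)
```

In the balanced draw roughly half the samples are high-concentration episodes, so the network sees the rare events often enough to learn from them instead of treating them as noise.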

Keywords: artificial intelligence, air quality prediction, neural networks, pattern recognition, PM-10

Procedia PDF Downloads 232