Search results for: physiological data extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26720

23510 Recovery of Copper from Edge Trims of Printed Circuit Boards Using Acidithiobacillus ferrooxidans Bioleaching

Authors: Shashi Arya, Nand L. Singh, Samiksha Singh, Pradeep K. Mishra, Siddh N. Upadhyay

Abstract:

The enormous generation of e-waste and its recycling are of great environmental concern, especially in developing countries such as India. A major fraction of this waste comprises printed circuit boards (PCBs), whose edge trims have a high copper content of 25-60%. The extraction of various metals from PCBs is a more or less proven technology, but the hazardous chemicals used in resource recovery cause secondary pollution. The current trend is therefore to extract valuable metals using microbial strains, eliminating the problem of secondary pollutants. In this context, the present work aims at enhanced recovery of copper from edge trims through bioleaching with the bacterial strain Acidithiobacillus ferrooxidans. Raw materials such as motherboards, hard drives, floppy drives and DVD drives were obtained from the university warehouse. More than 90% of the copper could be extracted through bioleaching with Acidithiobacillus ferrooxidans. Above 20%, the inoculum concentration had only an insignificant effect on copper recovery; higher inoculum concentrations offered an advantage only during the initial 2-4 days. Complete recovery was obtained within 14-24 days.

Keywords: acidithiobacillus ferrooxidans, bioleaching, e-waste, printed circuit boards

Procedia PDF Downloads 322
23509 Customized Design of Amorphous Solids by Generative Deep Learning

Authors: Yinghui Shang, Ziqing Zhou, Rong Han, Hang Wang, Xiaodi Liu, Yong Yang

Abstract:

The design of advanced amorphous solids, such as metallic glasses, with targeted properties through artificial intelligence signifies a paradigm shift in physical metallurgy and materials technology. Here, we developed a machine-learning architecture that facilitates the generation of metallic glasses with targeted multifunctional properties. Our architecture integrates a state-of-the-art unsupervised generative adversarial network model with supervised models, allowing the incorporation of general prior knowledge, derived from thousands of data points across a vast range of alloy compositions, into the creation of data points for a specific type of composition, overcoming the data scarcity typically encountered in the design of a given type of metallic glass. Using our generative model, we have successfully designed copper-based metallic glasses that display exceptionally high hardness or a remarkably low modulus. Notably, our architecture can not only explore uncharted regions of the targeted compositional space but also permits self-improvement as experimentally validated data points are added to the initial dataset for subsequent cycles of data generation, paving the way for the customized design of amorphous solids without human intervention.

Keywords: metallic glass, artificial intelligence, mechanical property, automated generation

Procedia PDF Downloads 43
23508 R Data Science for Technology Management

Authors: Sunghae Jun

Abstract:

Technology management (TM) is an important issue for companies seeking to improve their competitiveness. Among the many activities of TM, technology analysis (TA) is a key factor, because most management-of-technology decisions rest on TA results. TA analyzes the development of a target technology using statistics or the Delphi method. Delphi-based TA depends on experts' domain knowledge; in contrast, TA based on statistics and machine learning algorithms uses objective data, such as patents or papers, instead of expert knowledge. Many quantitative TA methods based on statistics and machine learning have been studied and applied to technology forecasting, technological innovation, and management of technology, but they employ diverse computing tools and analytical methods case by case, and it is not easy to select suitable software and statistical methods for a given TA task. In this paper, we therefore propose a methodology for quantitative TA that uses the statistical computing environment R and data science to construct a general TA framework. A case study shows how the methodology is applied in a real field. This research contributes to R&D planning and technology valuation in TM.
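
As an illustration of the kind of quantitative TA the abstract describes, the sketch below counts keyword frequencies in a toy patent-title corpus. The titles, the stopword list, and the use of Python rather than R are all illustrative assumptions, not the paper's materials:

```python
from collections import Counter
import re

# Hypothetical mini patent corpus (titles only) for a target technology.
patents = [
    "Deep learning apparatus for image recognition",
    "Image recognition system using convolutional networks",
    "Method for training deep neural networks",
    "Apparatus for real-time image segmentation",
]

STOPWORDS = {"for", "using", "method", "apparatus", "system", "of", "and", "a", "the"}

def keyword_frequencies(docs):
    """Tokenize titles and count domain keywords, skipping stopwords."""
    tokens = []
    for doc in docs:
        tokens += [t for t in re.findall(r"[a-z]+", doc.lower()) if t not in STOPWORDS]
    return Counter(tokens)

freq = keyword_frequencies(patents)
print(freq.most_common(3))  # most frequent technology keywords
```

In a full TA workflow, such term frequencies would feed into forecasting or clustering models rather than being reported directly.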

Keywords: technology management, R system, R data science, statistics, machine learning

Procedia PDF Downloads 450
23507 Mixture Statistical Modeling for Predicting Mortality in Human Immunodeficiency Virus (HIV) and Tuberculosis (TB) Infected Patients

Authors: Mohd Asrul Affendi Bi Abdullah, Nyi Nyi Naing

Abstract:

The purpose of this study was to compare the negative binomial death rate (NBDR) model and the zero-inflated negative binomial death rate (ZINBDR) model for patients who died with HIV+TB+ and HIV+TB− status. HIV and TB are serious worldwide problems, particularly in developing countries. The data were analyzed by fitting both the NBDR and ZINBDR models to determine which provides the better fit. The ZINBDR model accounts for the disproportionately large number of zeros in the data and was consistently a better fit than the NBDR model. Hence, the ZINBDR model is superior to the NBDR model for these data and provides additional information regarding the death mechanisms of HIV+TB patients. The ZINBDR model is shown to be a useful tool for analyzing death rates across age categories.

Keywords: zero inflated negative binomial death rate, HIV and TB, AIC and BIC, death rate

Procedia PDF Downloads 421
23506 Association between Physical Composition, Swimming Performance and Somatotype of Male Competitive Swimmers of Age Group 10-13 Years

Authors: Ranjit Singh

Abstract:

Body fat percentage, lean body mass and body type play a vital role in sports performance. An athlete with optimal body composition can perform flawlessly, whereas one who is not physically fit may be more prone to injury. Competitive swimming brings together a plethora of aspects: morphological, physiological, biochemical, biomechanical and psychological. The primary aim of the present research is to examine the correlation between selected morphological dimensions, such as height, weight, body fat percentage, lean body mass and somatotype, and swimming performance. The study also seeks to identify potential deficiencies, if any, and remedial measures to curb training stresses. Thirty swimmers (age group 10-14 years) training under skilled, professional coaches were selected for the present study. The morphological variables and performance criteria, namely 50 m swimming time and speed, were measured using standard methodology. Correlation coefficients among body composition, somatotype and performance variables were computed using the statistical package SPSS. The mean height, weight, fat percentage and lean body mass of the group were 150.97±8.68 cm, 44.0±9.34 kg, 15.97±4.42% and 37.10±8.77 kg, respectively. The somatotype of the young swimmers in this research was ectomorphic mesomorph. The results illustrated that swimming performance is significantly correlated (p<0.05) with height, body weight, the mesomorphic component and lean body mass. Body fat is significantly and negatively correlated (p<0.05) with the mesomorphic component, lean body mass and swimming speed. From the present study, it can be concluded that, along with technique and tactics, physical attributes also play a significant role in swimming performance; swimmers with better morphological qualities can ultimately perform well and excel at higher levels of competition.
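
The correlation analysis the abstract describes can be sketched as follows, on synthetic swimmer data. All values and effect sizes below are illustrative assumptions; the study used SPSS on real measurements:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 30                                   # thirty swimmers, as in the study

# Hypothetical morphological data (not the study's measurements).
height = rng.normal(151.0, 8.7, n)       # cm
lean_mass = 0.25 * height + rng.normal(0, 2.0, n)   # kg, correlated with height
fat_pct = rng.normal(16.0, 4.4, n)       # %

# Hypothetical 50 m swim speed: improves with lean mass, worsens with fat %.
speed = 0.02 * lean_mass - 0.01 * fat_pct + rng.normal(0, 0.05, n)

data = np.vstack([height, lean_mass, fat_pct, speed])
r = np.corrcoef(data)                    # Pearson correlation matrix

print("speed vs lean mass r =", round(r[3, 1], 2))
print("speed vs fat %     r =", round(r[3, 2], 2))
```

The signs of the sample correlations mirror the abstract's finding: positive for lean mass, negative for body fat.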

Keywords: body fat, mass, mesomorphic component, somatotype

Procedia PDF Downloads 230
23505 Simultaneous Removal of Arsenic and Toxic Metals from Contaminated Soil: A Pilot-Scale Demonstration

Authors: Juan Francisco Morales Arteaga, Simon Gluhar, Anela Kaurin, Domen Lestan

Abstract:

Contaminated soils are recognized as one of the most pressing global environmental problems. Arsenic (As) is one of the most hazardous elements: chronic exposure to arsenic has devastating effects on health, causing cardiovascular diseases, cancer, and eventually death. Pb, Zn and Cd are highly toxic metals that affect almost every organ in the body. With this in mind, new technologies for soil remediation are urgently needed. A calcareous soil artificially contaminated with 231 mg kg-1 As and historically contaminated with Pb, Zn and Cd was washed at a 1:1.5 solid-liquid ratio with 90 mM EDTA, 100 mM oxalic acid, and 50 mM sodium dithionite, removing 59, 75, 29, and 53% of As, Pb, Zn, and Cd, respectively. To reduce emissions of residual EDTA and chelated metals from the remediated soil, zero-valent iron (ZVI) was added (1% w/w) to the slurry of the washed soil immediately prior to rinsing. Experimental controls were conducted without the addition of ZVI after remediation. The use of ZVI reduced metal leachability and minimized toxic emissions 21 days after remediation. After this time, NH4NO3 extraction was performed to determine the mobility of toxic elements in the soil. In addition, the Unified Human Bioaccessibility Method (UBM) was performed to quantify the bioaccessibility of the metals in simulated human gastric and gastrointestinal phases.

Keywords: soil remediation, soil science, soil washing, toxic metals removal

Procedia PDF Downloads 170
23504 Efficient Reuse of Exome Sequencing Data for Copy Number Variation Callings

Authors: Chen Wang, Jared Evans, Yan Asmann

Abstract:

With the rapid evolution of next-generation sequencing techniques, whole-exome or exome-panel data have become a cost-effective way to detect small exonic mutations, but there has been a growing desire to accurately detect copy number variations (CNVs) as well. To address these research and clinical needs, we developed a sequencing coverage pattern-based method for copy number detection, data integrity checks, CNV calling, and visualization reports. The developed methodology is fully automated to increase usability and includes genome content-coverage bias correction, CNV segmentation, data quality reports, and publication-quality images. Poor-quality outlier samples are identified and removed automatically. Multiple experimental batches are routinely detected and further reduced to a clean subset of samples before analysis. Algorithmic improvements were also made to somatic CNV detection as well as germline CNV detection in trio families. Additionally, a set of utilities is included to help users produce CNV plots for genes of interest. We demonstrate the somatic CNV enhancements by accurately detecting CNVs in exome-wide data from The Cancer Genome Atlas samples and in a lymphoma case study with paired tumor and normal samples. We also show efficient reuse of existing exome sequencing data for improved germline CNV calling in a trio family from the phase III study of the 1000 Genomes Project, detecting CNVs with various modes of inheritance. The performance of the developed method is evaluated by comparing its CNV calls with results from orthogonal copy number platforms. 
Through these case studies, reusing exome sequencing data for CNV calling offers several notable benefits, including better quality control of exome sequencing data, improved joint analysis with single nucleotide variant calls, and novel genomic discovery from under-utilized existing whole-exome and custom exome-panel data.
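
The coverage-pattern idea behind such CNV calling can be sketched as a log2 depth-ratio computation on synthetic exon counts. The toy data and the naive threshold-free median call below are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(3)
n_exons = 200

# Hypothetical per-exon read depths (not real data): a reference pool and a
# test sample carrying a one-copy gain (exons 80-99) and a one-copy loss (140-149).
reference = rng.poisson(100, n_exons).astype(float)
test = rng.poisson(100, n_exons).astype(float)
test[80:100] = rng.poisson(150, 20)   # 3 copies instead of 2 -> 1.5x depth
test[140:150] = rng.poisson(50, 10)   # 1 copy instead of 2 -> 0.5x depth

# Normalize each sample for total library size, then take log2 ratios.
log2r = np.log2((test / test.sum()) / (reference / reference.sum()))

# Naive per-region call from the median log2 ratio (real pipelines segment first
# and correct for content-coverage bias, as the abstract describes).
for name, region in [("gain", slice(80, 100)), ("loss", slice(140, 150))]:
    print(name, round(float(np.median(log2r[region])), 2))
```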

Keywords: bioinformatics, computational genetics, copy number variations, data reuse, exome sequencing, next generation sequencing

Procedia PDF Downloads 250
23503 [Keynote]: No-Trust-Zone Architecture for Securing Supervisory Control and Data Acquisition

Authors: Michael Okeke, Andrew Blyth

Abstract:

Supervisory Control and Data Acquisition (SCADA) systems, as state-of-the-art Industrial Control Systems (ICS), are used in many critical infrastructures, from smart homes to energy systems and from locomotives to planes. The security of SCADA systems is vital, since many lives depend on them for daily activities, and deviation from normal operation could be disastrous for the environment as well as for lives. This paper describes how a No-Trust-Zone (NTZ) architecture can be incorporated into SCADA systems in order to reduce the chances of malicious intent succeeding. The architecture is made up of two distinct parts. The first comprises the field devices, such as sensors, PLCs, pumps, and actuators. The second part follows the lambda architecture and consists of a detection algorithm based on Particle Swarm Optimization (PSO) and the Hadoop framework for data processing and storage. Apache Spark forms part of the lambda architecture for real-time analysis of packets for anomaly detection.
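
A minimal PSO sketch, minimizing a stand-in objective rather than the paper's (unspecified) anomaly-detection objective:

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization: minimize f over R^dim."""
    rnd = random.Random(0)
    pos = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < f(gbest):
                    gbest = pos[i][:]
    return gbest, f(gbest)

def sphere(p):
    """Stand-in objective; a real deployment would score packet features."""
    return sum(x * x for x in p)

best, best_val = pso(sphere)
print(best_val)
```

In the paper's setting the objective would instead score packet-feature vectors, with the swarm searching for detector parameters.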

Keywords: industrial control system (ICS), no-trust-zone (NTZ), particle swarm optimisation (PSO), supervisory control and data acquisition (SCADA), swarm intelligence (SI)

Procedia PDF Downloads 335
23502 A Study on the Correlation Analysis between the Pre-Sale Competition Rate and the Apartment Unit Plan Factor through Machine Learning

Authors: Seongjun Kim, Jinwooung Kim, Sung-Ah Kim

Abstract:

The development of information and communication technology affects human cognition and thinking, and new techniques are being tried, especially in the field of design. In architecture, new design methodologies such as machine learning and data-driven design are being applied, in particular for analyzing factors related to real-estate value or assessing feasibility in the early planning stage of apartment housing. However, since the value of apartment buildings is often determined by external factors, such as location and traffic conditions, rather than by the interior elements of the buildings, data are rarely used in the design process itself. Even where the technical conditions are provided, it is difficult to apply data-driven design to the internal elements of the apartment during the design process. As a result, designers of apartment housing have been forced to rely on experience or modular design alternatives rather than data-driven design at the design stage, resulting in uniform arrangements of space in apartment houses. The purpose of this study is to propose a methodology that supports designers in producing apartment unit plans with high consumer preference, by deriving through machine learning the correlation and importance of the floor plan elements preferred by consumers and reflecting this information in the early design process. Data on the pre-sale competition rate and the elements of the floor plan are collected, and the correlation between the pre-sale competition rate and the independent variables is analyzed through machine learning. This analytical model can be used to review apartment unit plans produced by the designer and to assist the designer. 
Because the trained model can provide feedback on a unit plan when used in floor plan design, it becomes possible to produce floor plans of apartment housing with high preference.
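
A sketch of the kind of importance analysis described, on synthetic data with hypothetical floor-plan features. The feature names, the data, and the random-forest choice are illustrative assumptions, not the study's variables or model:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 400

# Hypothetical unit-plan features (illustrative names only).
bay_count = rng.integers(2, 5, n).astype(float)      # number of bays
balcony_area = rng.uniform(3, 12, n)                 # m^2
living_ratio = rng.uniform(0.25, 0.45, n)            # living-room share of area

# Synthetic pre-sale competition rate driven mainly by bay count.
rate = 3.0 * bay_count + 0.5 * balcony_area + rng.normal(0, 1.0, n)

X = np.column_stack([bay_count, balcony_area, living_ratio])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, rate)

for name, imp in zip(["bay_count", "balcony_area", "living_ratio"],
                     model.feature_importances_):
    print(f"{name}: {imp:.2f}")
```

The importance ranking is the kind of feedback a designer could use when reviewing a candidate unit plan.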

Keywords: apartment unit plan, data-driven design, design methodology, machine learning

Procedia PDF Downloads 260
23501 Nonparametric Truncated Spline Regression Model on the Data of Human Development Index in Indonesia

Authors: Kornelius Ronald Demu, Dewi Retno Sari Saputro, Purnami Widyaningsih

Abstract:

The Human Development Index (HDI) is a standard measurement of a country's human development. Several factors may influence it, such as life expectancy, gross domestic product (GDP) based on the province's annual expenditure, the number of poor people, and the percentage of illiterate people. The scatter plots between HDI and these factors do not follow a specific pattern or form, so the HDI data for Indonesia can be modeled with nonparametric regression. The estimated regression curve in a nonparametric regression model is flexible because it follows the shape of the data pattern. One nonparametric regression method is the truncated spline, a modification of segmented polynomial functions. The estimator of a truncated spline regression model depends on the selection of optimal knot points, the focal points of the truncated spline functions. The optimal knot points were determined by the minimum value of generalized cross-validation (GCV). In this article, a truncated spline nonparametric regression model was applied to the Human Development Index data. The best truncated spline regression model for the HDI data in Indonesia was obtained with the combination of optimal knot points 5-5-5-4. Life expectancy and the percentage of illiterate people were the factors significantly related to HDI in Indonesia. The coefficient of determination is 94.54%, which means the regression model fits the HDI data for Indonesia well.
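
The truncated spline basis and GCV-based knot selection can be sketched as follows, on synthetic one-dimensional data with a single knot. The article's model uses four covariates and the knot combination 5-5-5-4; this is only a one-covariate illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 120))
y = x + 2.0 * np.maximum(x - 0.5, 0) + rng.normal(0, 0.02, 120)  # kink at 0.5

def truncated_spline_basis(x, knots, degree=1):
    """Polynomial terms plus truncated power terms (x - k)_+^degree."""
    cols = [x ** d for d in range(degree + 1)]
    cols += [np.maximum(x - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)

def gcv(x, y, knots):
    """Generalized cross-validation score for a given knot set."""
    B = truncated_spline_basis(x, knots)
    H = B @ np.linalg.pinv(B)             # hat (smoother) matrix
    n = len(y)
    rss = np.sum((y - H @ y) ** 2)
    return (rss / n) / (1.0 - np.trace(H) / n) ** 2

candidates = [0.3, 0.4, 0.5, 0.6, 0.7]
scores = {k: gcv(x, y, [k]) for k in candidates}
best = min(scores, key=scores.get)
print("optimal knot:", best)
```

The knot minimizing GCV coincides with the true kink location, mirroring the article's knot-selection procedure.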

Keywords: generalized cross validation (GCV), Human Development Index (HDI), knots point, nonparametric regression, truncated spline

Procedia PDF Downloads 327
23500 Bioconversion of Capsaicin Using the Optimized Culture Broth of the Lipase-Producing Bacterium Stenotrophomonas maltophilia

Authors: Doostishoar Farzad, Forootanfar Hamid, Hasan-Bikdashti Morvarid, Faramarzi Mohammad Ali, Ameri Atefe

Abstract:

Introduction: Chili peppers and related plants of the genus Capsicum produce a mixture of capsaicins that exhibit anticarcinogenic, antimutagenic, and chemopreventive properties. Vanillylamine, the main product of capsaicin hydrolysis, is used as a precursor for manufacturing natural vanillin (a well-known flavor compound). It is also used in the production of synthetic capsaicins harboring a wide variety of physiological and biological activities, such as antibacterial and anti-inflammatory effects, enhancement of adrenal catecholamine secretion, and analgesic and antioxidative activities. The ability of some lipases, such as Novozym 677 BG and Novozym 435, and also some proteases, e.g. trypsin and penicillin acylase, to hydrolyze capsaicin for the green synthesis of vanillylamine has been investigated. In the present study, the optimized culture broth of a newly isolated lipase-producing bacterial strain (Stenotrophomonas maltophilia) was applied to the hydrolysis of capsaicin. Materials and methods: In order to compare the hydrolytic activity of the optimized and basal culture broths toward capsaicin, 2 mL of each culture broth (as the lipase source) was introduced into a capsaicin solution (500 mg/L), and the reaction mixture (total volume of 3 mL) was incubated at 40 °C and 120 rpm. Samples were taken every 2 h and analyzed for vanillylamine formation using HPLC. An identical reaction mixture containing boiled supernatant (to inactivate the lipase) served as the blank, and each experiment was performed in triplicate. Results: 215 mg/L of vanillylamine was produced after treating capsaicin with the optimized medium for 18 h, while only 61 mg/L of vanillylamine was detected with the basal medium under the same conditions. No capsaicin conversion was observed in the blank sample, in which lipase activity was suppressed by boiling the sample for 10 min. 
Conclusion: The application of the optimized culture broth to the hydrolysis of capsaicin led to a 43% conversion of that pungent compound to vanillylamine.

Keywords: capsaicin, green synthesis, lipase, stenotrophomonas maltophilia

Procedia PDF Downloads 476
23499 Extraction of Inulin from Cichorium intybus and Its Application as Fat Replacer in Yoghurt

Authors: Hafiz Khuram Wasim Aslam, Muhammad Saeed, Azam Shakeel, Muhammad Inam Ur Raheem, Moazzam Rafiq Khan, Muhammad Atif Randhawa

Abstract:

Inulin is a significant ingredient in the food industry that functions technologically as a fat replacer, often without compromising taste and texture. In this study, inulin was extracted from chicory roots, and the effect of inulin addition as a fat replacer on the physicochemical, microbiological and sensory properties of non-fat yogurt was investigated. Supplementation with chicory inulin reduced firmness in comparison with non-inulin-supplemented non-fat yoghurt. Higher acidity values, in the range of 0.56 to 0.75 during storage, were observed in the inulin-containing yogurt due to greater microbial fermentation compared with the non-inulin yogurt. Syneresis in the control sample increased from 43.9% to 47.9% during storage, while in the inulin treatments it changed from 44.5% to 47.6%. Inulin addition at various concentrations caused an increase in TPC due to its prebiotic effect. No effect of inulin addition on fat and protein contents was observed. Non-fat yoghurt supplemented with inulin showed better sensory behavior than the control yoghurt; the most important effect of adding inulin to non-fat yoghurt was an increase in the sensory attributes of appearance, body and texture, taste and mouthfeel, and overall acceptability. On average, yoghurt supplemented with 1 to 2% inulin had better overall acceptance than the control yoghurt.

Keywords: inulin, fat replacer, yoghurt, sensory evaluation, low fat

Procedia PDF Downloads 582
23498 Impact of Protean Career Attitude on Career Success with the Mediating Effect of Career Insight

Authors: Prabhashini Wijewantha

Abstract:

This study examines the impact of employees' protean career attitude on their career success, and the mediating effect of career insight on that relationship. Career success is defined as the accomplishment of desirable work-related outcomes at any point in a person's work experience over time, and it comprises two sub-variables: career satisfaction and perceived employability. Protean career attitude was measured using the eight items of the Self-Directedness subscale of the Protean Career Attitude scale developed by Briscoe and Hall, whereas career satisfaction was measured by the three-item scale developed by Martins, Eddleston, and Veiga. Perceived employability was also evaluated using three items, and career insight was measured using fourteen items adapted and used by De Vos and Soens. Data were collected from a sample of 300 mid-career executives in Sri Lanka using a survey strategy and analyzed with SPSS and AMOS version 20.0. A preliminary analysis was first performed in which the data were screened and reliability and validity were ensured. Next, a simple regression analysis tested the direct impact of protean career attitude on career success, and the hypothesis was supported. Baron and Kenny's four-step, three-regression approach to mediator testing was used to assess the mediating effect of career insight on the above relationship, and partial mediation was supported by the data. Finally, theoretical and practical implications are discussed.
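
Baron and Kenny's three-regression mediation test can be sketched on synthetic data. The variables and effect sizes below are illustrative assumptions, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300

# Synthetic illustration: protean attitude (X) -> career insight (M) -> success (Y).
X = rng.normal(0, 1, n)
M = 0.6 * X + rng.normal(0, 1, n)
Y = 0.5 * M + 0.2 * X + rng.normal(0, 1, n)

def coefs(y, *xs):
    """OLS coefficients (intercept first) via least squares."""
    A = np.column_stack([np.ones(len(y))] + list(xs))
    return np.linalg.lstsq(A, y, rcond=None)[0]

c = coefs(Y, X)[1]                 # step 1: total effect of X on Y
a = coefs(M, X)[1]                 # step 2: X must predict the mediator M
b, c_prime = coefs(Y, M, X)[1:]    # step 3: M predicts Y controlling for X

print(f"total c={c:.2f}, direct c'={c_prime:.2f}, indirect a*b={a * b:.2f}")
```

Partial mediation, as the abstract reports, corresponds to the direct effect c' shrinking toward zero but remaining nonzero once the mediator is controlled.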

Keywords: career success, career insight, mid career MBAs, protean career attitude

Procedia PDF Downloads 353
23497 Studying the Influence of Systematic Pre-Occupancy Data Collection through Post-Occupancy Evaluation: A Shift in the Architectural Design Process

Authors: Noor Abdelhamid, Donovan Nelson, Cara Prosser

Abstract:

The architectural design process can be mapped as a dialogue between designer and user, constructed across multiple phases with the overarching goal of aligning design outcomes with user needs. Traditionally, this dialogue is bounded by a preliminary phase of determining the factors that will direct the design intent, and a completion phase of handing the project off to the client. Pre- and post-occupancy evaluations (P/POEs) can provide an alternative process by extending this dialogue at both ends of the design process. The purpose of this research is to study the influence of systematic pre-occupancy data collection on achieving design goals by conducting post-occupancy evaluations of two case studies. In the context of this study, systematic pre-occupancy data collection is defined as the preliminary documentation of existing conditions that helps portray stakeholders' needs. When implemented, pre-occupancy data collection occurs during the early phases of the architectural design process, and the information is used to shape the design intent. Investigative POEs are performed on two case studies with distinct early design approaches to understand how the current space is affecting user needs, establish design outcomes, and inform future strategies. The first case study underwent systematic pre-occupancy data collection and synthesis, while the other represents the traditional, uncoordinated practice of informally collecting data during an early design phase. POEs target the dynamics between the building and its occupants by studying how spaces serve the needs of their users. Data collection for this study consists of user surveys, audiovisual materials, and observations during regular site visits. Mixed qualitative and quantitative methods of analysis are synthesized to identify patterns in the data. 
The paper concludes by placing value on both sides of the architectural design process: the integration of systematic pre-occupancy methods in the early phases and the reinforcement of a continued dialogue between building and design team after building completion.

Keywords: architecture, design process, pre-occupancy data, post-occupancy evaluation

Procedia PDF Downloads 157
23496 An Analysis of Oil Price Changes and Other Factors Affecting Iranian Food Basket: A Panel Data Method

Authors: Niloofar Ashktorab, Negar Ashktorab

Abstract:

Oil exports fund nearly half of Iran's government expenditures, and for many years other countries have imposed various sanctions against Iran. Sanctions that primarily target Iran's key energy sector have harmed Iran's economy, although the strategic effects of sanctions may diminish as Iran adjusts to them economically. In this study, we evaluate the impact of the oil price and of sanctions against Iran on food commodity prices using a panel data method. We find that the food commodity prices, the oil price and the real exchange rate are stationary. The results show a positive effect of oil price changes, the real exchange rate and sanctions on food commodity prices.

Keywords: oil price, food basket, sanctions, panel data, Iran

Procedia PDF Downloads 349
23495 A Proposed Framework for Software Redocumentation Using Distributed Data Processing Techniques and Ontology

Authors: Laila Khaled Almawaldi, Hiew Khai Hang, Sugumaran A. l. Nallusamy

Abstract:

Legacy systems are crucial for organizations, but their intricacy and lack of documentation pose challenges for maintenance and enhancement. Redocumentation of legacy systems, the automatic or semi-automatic creation of documentation for software lacking sufficient records, aims to enhance system understandability, maintainability, and knowledge transfer. However, existing redocumentation methods need improvement in data processing performance and document generation efficiency, stemming from the need to handle the extensive and complex code of legacy systems efficiently. This paper proposes a method for semi-automatic legacy system redocumentation using semantic parallel processing and ontology. Leveraging parallel processing and ontology addresses the current challenges by distributing the workload and creating documentation with logically interconnected data. The paper outlines the challenges in legacy system redocumentation and suggests a redocumentation method based on parallel processing and ontology for improved efficiency and effectiveness.
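
The parallel-processing side of the proposal can be sketched as distributing a documentation-extraction pass over modules. The module sources below are hypothetical stand-ins, and the ontology step is omitted:

```python
import ast
from concurrent.futures import ThreadPoolExecutor

# Hypothetical legacy modules (source strings standing in for files on disk).
MODULES = {
    "billing.py": "def compute_invoice(x):\n    return x * 1.2\n",
    "ledger.py": "def post_entry(e):\n    pass\ndef close_period():\n    pass\n",
}

def document_module(item):
    """Parse one module and extract function names as stub documentation."""
    name, source = item
    tree = ast.parse(source)
    funcs = [n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
    return name, {"functions": funcs}

# Each worker handles one module; results are merged into one documentation map.
with ThreadPoolExecutor(max_workers=4) as pool:
    docs = dict(pool.map(document_module, MODULES.items()))

print(docs["ledger.py"]["functions"])
```

In the proposed method, the per-module outputs would then be linked through an ontology rather than a flat dictionary.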

Keywords: legacy systems, redocumentation, big data analysis, parallel processing

Procedia PDF Downloads 36
23494 Fully Eulerian Finite Element Methodology for the Numerical Modeling of the Dynamics of Heart Valves

Authors: Aymen Laadhari

Abstract:

During the last decade, an increasing number of contributions have been made in the fields of scientific computing and numerical methodologies applied to the study of hemodynamics in the heart. In contrast, the numerical aspects of the interaction of pulsatile blood flow with highly deformable thin leaflets have been much less explored. This coupled problem remains extremely challenging, and the numerical difficulties include, e.g., the resolution of the full fluid-structure interaction problem with large deformations of extremely thin leaflets, substantial mesh deformations, high transvalvular pressure discontinuities, and contact between leaflets. Although the Lagrangian description of structural motion and strain measures is naturally used, many numerical complexities can arise when studying large deformations of thin structures. Eulerian approaches represent a promising alternative for readily modeling large deformations and handling contact issues. We present a fully Eulerian finite element methodology tailored to the simulation of pulsatile blood flow in the aorta and sinus of Valsalva interacting with highly deformable thin leaflets. Our method enables the use of a fluid solver on a fixed mesh, whilst easily modeling the mechanical properties of the valve. We introduce a semi-implicit time integration scheme based on a consistent Newton-Raphson linearization. A variant of the classical Newton method is introduced and guarantees third-order convergence. High-fidelity computational geometries are built, and simulations are performed under physiological conditions. We address in detail the main features of the proposed method, and we report several experiments with the aim of illustrating its accuracy and efficiency.
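
The abstract does not specify the third-order Newton variant used; as an illustration of a classical third-order scheme, here is Halley's method applied to a scalar equation:

```python
def halley(f, df, d2f, x0, tol=1e-12, max_iter=50):
    """Halley's method: a cubically convergent variant of Newton's method."""
    x = x0
    for _ in range(max_iter):
        fx, dfx = f(x), df(x)
        # Newton step corrected by the second derivative (third-order scheme).
        step = fx / (dfx - 0.5 * fx * d2f(x) / dfx)
        x -= step
        if abs(step) < tol:
            break
    return x

# Solve x^2 - 2 = 0: converges to sqrt(2) in very few iterations.
root = halley(lambda x: x * x - 2, lambda x: 2 * x, lambda x: 2.0, x0=1.0)
print(root)
```

In the paper's setting the analogue is a nonlinear solve at each time step of the coupled system, where higher-order convergence reduces the number of linearized solves.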

Keywords: eulerian, level set, newton, valve

Procedia PDF Downloads 275
23493 Effects of Irrigation Scheduling and Soil Management on Maize (Zea mays L.) Yield in Guinea Savannah Zone of Nigeria

Authors: I. Alhassan, A. M. Saddiq, A. G. Gashua, K. K. Gwio-Kura

Abstract:

The main objective of any irrigation program is the development of an efficient water management system that sustains crop growth and development and avoids physiological water stress in the growing plants. A field experiment to evaluate the effects of some soil moisture conservation practices on the yield and water use efficiency (WUE) of maize was carried out in three locations (Mubi and Yola in the northern Guinea Savannah, and Ganye in the southern Guinea Savannah of Adamawa State, Nigeria) during the dry seasons of 2013 and 2014. The experiment consisted of three irrigation levels (7-, 10- and 12-day irrigation intervals), two levels of mulch (mulched and un-mulched) and two tillage practices (no tillage and minimum tillage), arranged in a randomized complete block design with a split-split plot arrangement and replicated three times. The Blaney-Criddle method was used to estimate crop evapotranspiration. The results indicated that the seven-day irrigation interval and the mulched treatment had significant effects (p<0.05) on grain yield and water use efficiency in all locations. The main effect of tillage on grain yield and WUE was non-significant (p>0.05). The interaction effects of irrigation and mulch on grain yield and WUE were significant (p<0.05) at Mubi and Yola. Generally, higher grain yield and WUE were recorded with mulching and seven-day irrigation intervals, whereas lower values were recorded without mulch at 12-day irrigation intervals. Tillage exerted little influence on yield and WUE. Results from Ganye were generally higher than those recorded at Mubi and Yola; they also showed that an irrigation interval of 10 days with mulching could be adopted for the Ganye area, while a seven-day interval is more appropriate for Mubi and Yola.
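
The simplified FAO form of the Blaney-Criddle estimate, ETo = p * (0.46*T + 8) mm/day, can be computed directly. The temperature and daytime-hours percentage below are illustrative values, not the study's site data:

```python
def blaney_criddle_eto(t_mean_c, p):
    """Simplified FAO Blaney-Criddle reference evapotranspiration (mm/day).

    t_mean_c: mean daily air temperature (deg C)
    p: mean daily percentage of annual daytime hours (fraction of 100, e.g. 0.27)
    """
    return p * (0.46 * t_mean_c + 8.0)

eto = blaney_criddle_eto(t_mean_c=25.0, p=0.27)
print(f"ETo = {eto:.2f} mm/day")   # 0.27 * (11.5 + 8) = 5.27 mm/day
```

Such a daily ETo, multiplied by a crop coefficient, is what an irrigation schedule like the 7-, 10- or 12-day intervals above would be sized against.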

Keywords: irrigation, maize, mulching, tillage, savanna

Procedia PDF Downloads 206
23492 Exploratory Analysis of A Review of Nonexistence Polarity in Native Speech

Authors: Deawan Rakin Ahamed Remal, Sinthia Chowdhury, Sharun Akter Khushbu, Sheak Rashed Haider Noori

Abstract:

Native speech-to-text synthesis offers considerable benefits for mankind. Speaking a language with different accents is common, but communication between people with two different accent types can be difficult, because mis-recognition can distort the intended meaning of the language. Many automatic speech recognition systems have therefore been developed to detect text. This paper presents a review of NSTTR (Native Speech Text to Text Recognition) synthesis compared with text-to-text recognition. The review shows that many text-to-text recognition systems are still at a very early stage of handling native speech recognition. Much of the existing discussion concerns the progression of chatbots and linguistic theory; another line of work follows rule-based approaches. In recent years, deep learning has become a dominant approach for text-to-text learning to detect the nature of language. To the best of our knowledge, a huge number of people in the subcontinent speak the Bangla language, but with different accents in different regions; this study therefore elaborates on the contradictory findings of existing works and on future needs for Bangla-language acoustic accents.

Keywords: TTR, NSTTR, text to text recognition, deep learning, natural language processing

Procedia PDF Downloads 121
23491 Abdominal Organ Segmentation in CT Images Based On Watershed Transform and Mosaic Image

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

Accurate liver, spleen and kidney segmentation in abdominal CT images is one of the most important steps for computer-aided diagnosis of abdominal organ pathology. In this paper, we propose a new semi-automatic algorithm for liver, spleen and kidney extraction in abdominal CT images. Our method is based on hierarchical segmentation and the watershed algorithm. In our approach, a technique has been designed to suppress over-segmentation, based on the mosaic image and on the computation of the watershed transform. The algorithm proceeds in two parts. In the first, we seek to improve the quality of the gradient-mosaic image; here we propose improving the gradient-mosaic image by applying an anisotropic diffusion filter followed by morphological filters. Thereafter we proceed to the hierarchical segmentation of the liver, spleen and kidneys. To validate the proposed segmentation technique, we tested it on several images. Our segmentation approach is evaluated by comparing our results with manual segmentation performed by an expert. The experimental results are described in the last part of this work.
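The marker-driven flooding at the heart of the watershed step can be sketched minimally as a priority flood from labeled seeds; this is a toy stand-in for the hierarchical watershed applied to the gradient-mosaic image, and the grid and seeds below are invented:

```python
import heapq

def marker_watershed(elevation, markers):
    """Minimal marker-driven watershed flooding on a 2D grid.
    Each seed in `markers` (a {(row, col): label} dict) floods outward
    in order of increasing elevation; every pixel takes the label of
    the basin that reaches it first."""
    rows, cols = len(elevation), len(elevation[0])
    labels = [[0] * cols for _ in range(rows)]
    heap = []
    for (r, c), lab in markers.items():
        labels[r][c] = lab
        heapq.heappush(heap, (elevation[r][c], r, c, lab))
    while heap:
        _, r, c, lab = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and labels[nr][nc] == 0:
                labels[nr][nc] = lab  # claim the pixel for this basin
                heapq.heappush(heap, (elevation[nr][nc], nr, nc, lab))
    return labels

# Two seeds on a ridge-separated toy "gradient" image
grid = [[1, 1, 5, 1, 1],
        [1, 1, 5, 1, 1],
        [1, 1, 5, 1, 1]]
seeds = {(1, 0): 1, (1, 4): 2}
result = marker_watershed(grid, seeds)
```

Suppressing over-segmentation, as in the paper, amounts to limiting the set of seeds (here via the mosaic-image regions) before flooding.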

Keywords: anisotropic diffusion filter, CT images, morphological filter, mosaic image, multi-abdominal organ segmentation, watershed algorithm

Procedia PDF Downloads 487
23490 Effects of Neem (Azadirachta indica A. Juss) Kernel Inclusion in Broiler Diet on Growth Performance, Organ Weight and Gut Morphometry

Authors: Olatundun Bukola Ezekiel, Adejumo Olusoji

Abstract:

A feeding trial was conducted with 100 two-week-old broiler chickens to evaluate the influence of including neem kernel in broiler diets at 0, 2.5, 5, 7.5 and 10% (used to replace an equal quantity of maize) on their performance, organ weight and gut morphometry. The birds were randomly allotted to five dietary treatments, each treatment having four replicates of five broilers, in a completely randomized design. The diets were formulated to be iso-nitrogenous (23% CP). Weekly feed intake and changes in body weight were recorded and feed efficiency determined. At the end of the 28-day feeding trial, four broilers per treatment were selected and sacrificed for carcass evaluation. Results were subjected to analysis of variance using Statistical Analysis Software; treatment means were presented with group standard errors of means and, where significant, were compared using the Duncan multiple range test of the same software. The results showed that broilers fed diets with 2.5% neem kernel had growth performance statistically comparable to those fed the control diet. Birds on 5, 7.5 and 10% neem kernel diets showed a significant (P<0.05) increase in the relative weight of the liver. The absolute weight of the spleen also increased significantly (P<0.05) in birds on the 10% neem kernel diet. Diets with more than 5% neem kernel gave a significant (P<0.05) increase in the relative weight of the kidney. The length of the small intestine increased significantly in birds fed the 7.5 and 10% neem kernel diets. Significant differences (P<0.05) did not occur in the length of the large intestine or the right and left caeca. It is recommended that neem kernel can be included at up to 2.5% in broiler chicken diets without any deleterious effects on the performance and physiological status of the birds.

Keywords: broiler chicken, growth performance, gut morphometry, neem kernel, organ weight

Procedia PDF Downloads 755
23489 Randomly Casted Single-Wall Carbon Nanotubes Films for High Performance Hybrid Photovoltaic Devices

Authors: My Ali El Khakani

Abstract:

Single-wall carbon nanotubes (SWCNTs) possess an unprecedented combination of unique properties that makes them highly suitable for a new generation of photovoltaic (PV) devices. Prior to discussing the integration of SWCNT films into effective PV devices, we will briefly highlight our work on the synthesis of SWCNTs by means of the KrF pulsed laser deposition technique, their purification, and their transfer onto n-silicon substrates to form p-n junctions. Some of the structural and optoelectronic properties of SWCNTs relevant to PV applications will be emphasized. By varying the SWCNT film density (µg/cm2), we were able to identify an optimum value that yields the highest photoconversion efficiency (PCE) of ~10%. Further control of the doping of the p-SWCNT films, through their exposure to nitric acid vapors, along with the insertion of an optimized hole-extraction layer in the p-SWCNTs/n-Si hybrid devices, permitted a PCE value as high as 14.2%. Such a high PCE value demonstrates the full potential of these p-SWCNTs/n-Si devices for sunlight photoconversion. On the other hand, by examining both the optical transmission and the electrical conductance of the SWCNT films, we established a figure of merit (FOM) that was shown to correlate well with the PCE. Such a direct relationship between the FOM and the PCE can be used as a guide for further PCE enhancement of these novel p-SWCNTs/n-Si PV devices.
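The abstract does not state which FOM it uses; a common figure of merit for transparent conductive films relates sheet resistance Rs and optical transmittance T through the ratio of DC to optical conductivity. A sketch under that assumption (the Rs and T values are illustrative, not the paper's measurements):

```python
def fom_dc_optical(sheet_resistance_ohm_sq, transmittance):
    """Conductivity-ratio figure of merit for a transparent conductor:
    sigma_dc / sigma_opt = Z0 / (2 * Rs * (T**-0.5 - 1)),
    with Z0 ~ 377 ohm the impedance of free space. Higher is better:
    more conductive for a given transparency."""
    z0 = 376.73
    return z0 / (2.0 * sheet_resistance_ohm_sq
                 * (transmittance ** -0.5 - 1.0))

# Illustrative numbers for a doped SWCNT film (not from the paper)
fom = fom_dc_optical(sheet_resistance_ohm_sq=100.0, transmittance=0.85)
```

Under this definition, acid doping (which lowers Rs at nearly constant T) raises the FOM, consistent with the correlation to PCE reported above.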

Keywords: carbon nanotubes (CNTs), CNTs-silicon hybrid devices, photoconversion, photovoltaic devices, pulsed laser deposition

Procedia PDF Downloads 109
23488 Finite Element Modeling of Aortic Intramural Haematoma Shows Size Matters

Authors: Aihong Zhao, Priya Sastry, Mark L Field, Mohamad Bashir, Arvind Singh, David Richens

Abstract:

Objectives: Intramural haematoma (IMH) is one of the pathologies, along with acute aortic dissection, that present as Acute Aortic Syndrome (AAS). Evidence suggests that unlike aortic dissection, some intramural haematomas may regress with medical management. However, intramural haematomas have been traditionally managed like acute aortic dissections. Given that some of these pathologies may regress with conservative management, it would be useful to be able to identify which of these may not need high risk emergency intervention. A computational aortic model was used in this study to try and identify intramural haematomas with risk of progression to aortic dissection. Methods: We created a computational model of the aorta with luminal blood flow. Reports in the literature have identified 11 mm as the radial clot thickness that is associated with heightened risk of progression of intramural haematoma. Accordingly, haematomas of varying sizes were implanted in the modeled aortic wall to test this hypothesis. The model was exposed to physiological blood flows and the stresses and strains in each layer of the aortic wall were recorded. Results: Size and shape of clot were seen to affect the magnitude of aortic stresses. The greatest stresses and strains were recorded in the intima of the model. When the haematoma exceeded 10 mm in all dimensions, the stress on the intima reached breaking point. Conclusion: Intramural clot size appears to be a contributory factor affecting aortic wall stress. Our computer simulation corroborates clinical evidence in the literature proposing that IMH diameter greater than 11 mm may be predictive of progression. This preliminary report suggests finite element modelling of the aortic wall may be a useful process by which to examine putative variables important in predicting progression or regression of intramural haematoma.
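The finite element model itself cannot be reproduced from the abstract, but the order of magnitude of the wall stress it computes can be sketched with the thin-walled Laplace estimate, σ = P·r/t; all values below are illustrative, not outputs of the simulation:

```python
def hoop_stress_kpa(pressure_mmhg, radius_mm, wall_thickness_mm):
    """Laplace thin-wall estimate for a pressurised cylinder:
    sigma = P * r / t, returned in kPa (1 mmHg = 0.1333 kPa)."""
    pressure_kpa = pressure_mmhg * 0.1333
    return pressure_kpa * radius_mm / wall_thickness_mm

# Illustrative aortic values: 120 mmHg systolic, 15 mm radius, 2 mm wall
sigma = hoop_stress_kpa(pressure_mmhg=120.0, radius_mm=15.0,
                        wall_thickness_mm=2.0)
```

An intramural haematoma that thins the load-bearing portion of the wall raises this estimate directly, which is one intuition for why clot size matters.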

Keywords: intramural haematoma, acute aortic syndrome, finite element analysis

Procedia PDF Downloads 427
23487 Armenian Refugees in Early 20th C Japan: Quantitative Analysis on Their Number Based on Japanese Historical Data with the Comparison of a Foreign Historical Data

Authors: Meline Mesropyan

Abstract:

At the beginning of the 20th century, Japan served as a transit point for Armenian refugees fleeing the 1915 Genocide. However, research on Armenian refugees in Japan is sparse, and the Armenian Diaspora has never taken root in Japan. Consequently, Japan has not been considered a relevant research site for studying Armenian refugees. The primary objective of this study is to shed light on the number of Armenian refugees who passed through Japan between 1915 and 1930. Quantitative analyses will be conducted based on newly uncovered Japanese archival documents. Subsequently, the Japanese data will be compared to American immigration data to estimate the potential number of refugees in Japan during that period. This under-researched area is relevant to both the Armenian Diaspora and refugee studies in Japan. By clarifying the number of refugees, this study aims to enhance understanding of Japan's treatment of refugees and the extent of humanitarian efforts conducted by organizations and individuals in Japan, contributing to the broader field of historical refugee studies.

Keywords: Armenian genocide, Armenian refugees, Japanese statistics, number of refugees

Procedia PDF Downloads 44
23486 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis

Authors: Gon Park

Abstract:

Seoul, South Korea, established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan does not include detailed analyses of green infrastructure that incorporate land-cover information across many structural classes. This study maps the green infrastructure networks of Seoul to complement these plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, were identified by incorporating cadastral data for 967,502 parcels into 135 land use maps using a geographic information system. Network analyses were used to rank the hubs and links of a green infrastructure map by applying a force-directed algorithm, weighted values, and binary relationships, with metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can serve as a framework to identify and rank hubs, links, and networks for green infrastructure planning under variable scenarios of green areas in cities.
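One simple centrality metric of the kind described above is weighted degree: a hub's rank is the sum of the weights of its links. A hedged sketch (the hub names and weights are placeholders, not Seoul parcel data):

```python
from collections import defaultdict

def rank_hubs(links):
    """Rank green-space hubs by weighted degree centrality.
    `links` is a list of (hub_a, hub_b, weight) tuples; the weight
    might encode inverse distance or shared boundary between parcels."""
    degree = defaultdict(float)
    for a, b, w in links:
        degree[a] += w
        degree[b] += w
    # Highest-degree hubs first
    return sorted(degree.items(), key=lambda kv: kv[1], reverse=True)

# Toy network of four hubs
links = [("park_a", "park_b", 2.0),
         ("park_b", "river_c", 1.5),
         ("park_a", "forest_d", 0.5)]
ranking = rank_hubs(links)
```

Distance- and density-based metrics would refine this ordering, but the same hub/link representation carries through.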

Keywords: cadastral data, green Infrastructure, network analysis, parcel data

Procedia PDF Downloads 193
23485 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms

Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao

Abstract:

Earth's environment and its evolution can be seen through satellite images in near real-time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing it, feeding the processed data into the proposed algorithm, and analyzing the obtained result. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification. The dataset used is the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates, in cascade or in parallel, to determine the scale of segments.
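The effect of atrous convolution can be illustrated in one dimension: spacing the kernel taps `rate` samples apart enlarges the receptive field without adding parameters. A toy sketch, not the DeepLabv3 implementation:

```python
def atrous_conv1d(signal, kernel, rate):
    """1D atrous (dilated) convolution: kernel taps are spaced `rate`
    samples apart, so a k-tap kernel covers a window of
    (k - 1) * rate + 1 samples with the same k weights."""
    k, span = len(kernel), (len(kernel) - 1) * rate
    out = []
    for i in range(len(signal) - span):  # 'valid' positions only
        out.append(sum(kernel[j] * signal[i + j * rate] for j in range(k)))
    return out

x = [0, 0, 0, 1, 0, 0, 0, 0]
# The same 3-tap kernel at two rates; rate=2 "sees" a 5-sample window
dense = atrous_conv1d(x, [1, 1, 1], rate=1)
dilated = atrous_conv1d(x, [1, 1, 1], rate=2)
```

DeepLabv3's atrous spatial pyramid applies several such rates in parallel over 2D feature maps, then merges the outputs to capture objects at multiple scales.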

Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50

Procedia PDF Downloads 135
23484 The Effect of CPU Location in Total Immersion of Microelectronics

Authors: A. Almaneea, N. Kapur, J. L. Summers, H. M. Thompson

Abstract:

Meeting the growth in demand for digital services such as social media, telecommunications, and business and cloud services requires large scale data centres, which has led to an increase in their end use energy demand. Generally, over 30% of data centre power is consumed by the necessary cooling overhead. Thus energy can be reduced by improving the cooling efficiency. Air and liquid can both be used as cooling media for the data centre. Traditional data centre cooling systems use air, however liquid is recognised as a promising method that can handle the more densely packed data centres. Liquid cooling can be classified into three methods: rack heat exchanger, on-chip heat exchanger and full immersion of the microelectronics. This study quantifies the improvements in heat transfer specifically for the case of immersed microelectronics by varying the CPU and heat sink location. Immersion of the server is achieved by filling the gap between the microelectronics and a water jacket with a dielectric liquid, which convects the heat from the CPU to the water jacket on the opposite side. Heat transfer is governed by two physical mechanisms: natural convection in the fixed enclosure filled with dielectric liquid, and forced convection for the water that is pumped through the water jacket. The model in this study is validated against published numerical and experimental work and shows good agreement. The results show that heat transfer performance and the Nusselt number (Nu) are improved by 89% by placing the CPU and heat sink at the bottom of the microelectronics enclosure.
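The natural-convection side of the problem can be sketched with the Rayleigh number and a generic turbulent correlation, Nu ≈ 0.15·Ra^(1/3); the fluid properties below are illustrative dielectric-liquid values, not those used in the paper's model:

```python
def rayleigh(g, beta, delta_t, length, nu, alpha):
    """Rayleigh number for buoyancy-driven convection in the
    liquid-filled gap: Ra = g * beta * dT * L**3 / (nu * alpha)."""
    return g * beta * delta_t * length ** 3 / (nu * alpha)

def nusselt_turbulent(ra):
    """A common turbulent natural-convection correlation,
    Nu ~ 0.15 * Ra**(1/3); the paper's Nu comes from its own CFD model."""
    return 0.15 * ra ** (1.0 / 3.0)

# Illustrative dielectric-liquid properties and a 5 cm enclosure gap
ra = rayleigh(g=9.81, beta=1.1e-3, delta_t=40.0, length=0.05,
              nu=1.0e-6, alpha=8.0e-8)
nu_number = nusselt_turbulent(ra)
```

Moving the heat source changes the effective convection pattern in the enclosure rather than Ra itself, which is why CPU location can shift Nu so strongly.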

Keywords: CPU location, data centre cooling, heat sink in enclosures, immersed microelectronics, turbulent natural convection in enclosures

Procedia PDF Downloads 268
23483 A Macroeconomic Analysis of Defense Industry: Comparisons, Trends and Improvements in Brazil and in the World

Authors: J. Fajardo, J. Guerra, E. Gonzales

Abstract:

This paper outlines a study of Brazil's industrial base of defense (IDB), using a bibliographic research method combined with an analysis of macroeconomic data from several publicly available data platforms. It begins with a brief study of Brazilian national industry, including analyses of productivity, income, output and jobs. Next, the research presents a study of the defense industry in Brazil, introducing the main national companies operating in the aeronautical, army and naval branches. After establishing the main features of the Brazilian defense industry, data on the productivity of the defense industries of the leading countries and of companies competing with Brazilian industry were analyzed, in order to place major Brazilian cases in a comparative context. Methodologically, bibliographic research and the exploration of historical data series were used to analyze information, identify trends and make comparisons over time. The research concludes with the main trends for the development of the Brazilian defense industry, comparing the current situation with the perspectives of several countries.

Keywords: economics of defence, industry, trends, market

Procedia PDF Downloads 147
23482 Delineating Subsurface Linear Features and Faults Under Sedimentary Cover in the Bahira Basin Using Integrated Gravity and Magnetic Data

Authors: M. Lghoul, N. El Goumi, M. Guernouche

Abstract:

In order to predict the structural and tectonic framework of the Bahira basin and to build a 3D geological model of the basin, an integrated multidisciplinary study has been conducted using gravity, magnetic and geological data. The objective of the current study is to delineate the subsurface features, faults and geological limits of the Bahira basin using airborne magnetic and gravity data analysis. To achieve this goal, we applied different enhancement techniques to the magnetic and gravity data: power spectral analysis, reduction to the pole (RTP), upward continuation, the analytical signal, the tilt derivative, the total horizontal derivative, 3D Euler deconvolution and source parameter imaging. The major lineament/fault trends are NE–SW, NW-SE, ENE–WSW and WNW–ESE. The 3D Euler deconvolution analysis highlighted a number of fault trends, mainly in the ENE-WSW and WNW-ESE directions. The depth to the top of the basement sources in the study area ranges from 200 m, in the southern and northern parts of the Bahira basin, to 5000 m in the eastern part of the basin.
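As a hedged illustration of depth-to-source estimation (far simpler than the 3D Euler deconvolution used in the study), the classic half-width rule for a sphere-like gravity source recovers depth from the distance at which the anomaly falls to half its maximum; the synthetic profile below is invented:

```python
def sphere_gravity(x, depth, amplitude=1.0):
    """Vertical gravity of a buried point mass (sphere) centred at x=0:
    g(x) = A * depth / (x**2 + depth**2) ** 1.5."""
    return amplitude * depth / (x ** 2 + depth ** 2) ** 1.5

def depth_from_half_width(xs, gs):
    """Half-width rule for a sphere source: depth = 1.305 * x_half,
    where x_half is the offset at which g drops to half its peak."""
    g_max = max(gs)
    x_half = min(x for x, g in zip(xs, gs) if x > 0 and g <= g_max / 2)
    return 1.305 * x_half

# Synthetic profile over a source buried at 5 km depth (units in km)
xs = [i * 0.01 for i in range(-2000, 2001)]
gs = [sphere_gravity(x, depth=5.0) for x in xs]
depth_est = depth_from_half_width(xs, gs)
```

Euler deconvolution generalises this idea: it solves for source position and depth from the field and its gradients with an assumed structural index, which is how the 200 m to 5000 m basement depths above are obtained.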

Keywords: magnetic, gravity, structural trend, depth to basement

Procedia PDF Downloads 128
23481 Assisted Prediction of Hypertension Based on Heart Rate Variability and Improved Residual Networks

Authors: Yong Zhao, Jian He, Cheng Zhang

Abstract:

Cardiovascular diseases caused by hypertension are extremely threatening to human health, and early diagnosis of hypertension can save a large number of lives. Traditional hypertension detection methods require special equipment and have difficulty tracking continuous blood pressure changes. In this regard, this paper first analyzes the principle of heart rate variability (HRV) and introduces a sliding window and power spectral density (PSD) to analyze the time-domain and frequency-domain features of HRV. Secondly, it designs an HRV-based hypertension prediction network combining ResNet, an attention mechanism and a multilayer perceptron: frequency-domain features are extracted through a modified ResNet18, fused with time-domain features through an attention mechanism, and used for the auxiliary prediction of hypertension through a multilayer perceptron. Finally, the network was trained and tested using the publicly available SHAREE dataset on PhysioNet. The test results showed that this network achieved 92.06% prediction accuracy for hypertension and outperformed K-Nearest Neighbor (KNN), Bayes, Logistic and traditional Convolutional Neural Network (CNN) models in prediction performance.
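The time-domain HRV features that typically feed such a network include SDNN and RMSSD computed over windows of RR intervals. A minimal sketch (the RR series below is invented, not SHAREE data):

```python
import math

def hrv_time_features(rr_ms):
    """Standard time-domain HRV features from RR intervals (ms):
    SDNN (standard deviation of all intervals, overall variability)
    and RMSSD (root mean square of successive differences,
    beat-to-beat variability)."""
    n = len(rr_ms)
    mean = sum(rr_ms) / n
    sdnn = math.sqrt(sum((r - mean) ** 2 for r in rr_ms) / n)
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return sdnn, rmssd

# Illustrative RR series (ms); real input comes from an ECG R-peak detector
rr = [812, 830, 795, 840, 825, 810, 835]
sdnn, rmssd = hrv_time_features(rr)
```

In the paper's pipeline these sliding-window time-domain features are fused, via the attention mechanism, with frequency-domain (PSD) features extracted by the modified ResNet18.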

Keywords: feature extraction, heart rate variability, hypertension, residual networks

Procedia PDF Downloads 94