Search results for: real anthropometric database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7001

4661 Code Evaluation on Web-Shear Capacity of Prestressed Hollow-Core Slabs

Authors: Min-Kook Park, Deuck Hang Lee, Hyun Mo Yang, Jae Hyun Kim, Kang Su Kim

Abstract:

Prestressed hollow-core slabs (HCS) are structurally optimized precast units with light-weight hollow sections, and they are very economical because they are mass-produced by a unique production method. They have therefore been widely used in precast concrete construction in many countries around the world. It is, however, difficult to provide shear reinforcement in HCS units produced by the extrusion method, so all shear forces must be resisted solely by the concrete webs of the HCS units. This means that, for HCS units, it is very important to estimate the contribution of the web concrete to the shear resistance accurately. In design codes, however, the shear strengths of HCS units are estimated by the same equations used for typical prestressed concrete members, which were calibrated against experimental results of conventional prestressed concrete members rather than HCS units. In this study, therefore, shear test results of HCS members covering a wide range of influential variables were collected, and the shear strength equations in design codes were thoroughly examined against the experimental results in the shear database of HCS members. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT & Future Planning (NRF-2016R1A2B2010277).

Keywords: hollow-core, web-shear, precast concrete, prestress, capacity

Procedia PDF Downloads 508
4660 Study of Mixing Conditions for Different Endothelial Dysfunction in Arteriosclerosis

Authors: Sara Segura, Diego Nuñez, Miryam Villamil

Abstract:

In this work, we studied the microscale interaction of foreign substances with blood inside an artificial transparent artery system that represents medium and small muscular arteries. This artery system had channels ranging from 75 μm to 930 μm and was fabricated using glass and transparent polymer blends such as Phenylbis(2,4,6-trimethylbenzoyl) phosphine oxide, Poly(ethylene glycol) and PDMS in order to be monitored in real time. The setup comprised a computer-controlled precision micropump and a high-resolution optical microscope capable of tracking fluids at high capture rates. Observation and analysis were performed using real-time software that reconstructs the fluid dynamics, determining the flux velocity, injection dependency, turbulence and rheology. All experiments were carried out with fully computer-controlled equipment. Interactions between substances such as water, serum (0.9% sodium chloride and electrolyte with a ratio of 4 ppm) and blood cells were studied at the microscale, at resolutions down to 400 nm, and the analysis was performed using frame-by-frame observation and HD video capture. These observations led us to understand the fluid and mixing behavior of the substances of interest in the blood stream and shed light on the use of implantable devices for drug delivery in arteries with different endothelial dysfunctions. Several substances were tested using the artificial artery system. Initially, Milli-Q water was used as a control substance to study the basic fluid dynamics of the artificial artery system. Then, serum and other low-viscosity substances were pumped into the system in the presence of other liquids to study the mixing profiles and behaviors. Finally, mammal blood was used for the final test while serum was injected. Different flow conditions, pumping rates, and time rates were evaluated to determine the optimal mixing conditions. Our results suggest using finely controlled microinjection, at an approximate rate of 135,000 μm³/s, to obtain better mixing profiles for the administration of drugs inside arteries.

Keywords: artificial artery, drug delivery, microfluidics dynamics, arteriosclerosis

Procedia PDF Downloads 297
4659 Performance Comparison of Thread-Based and Event-Based Web Servers

Authors: Aikaterini Kentroti, Theodore H. Kaskalis

Abstract:

Today, web servers are expected to serve thousands of client requests concurrently within stringent response time limits. In this paper, we experimentally evaluate and compare the performance and resource utilization of popular web servers that differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, 30th) and the content of a table from the database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached two times higher throughput than Node.js and five times higher than Apache and Nginx in the 20th Fibonacci term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are asked to respond by performing CPU-intensive tasks under increasing concurrent load.
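
The CPU-intensive workload in tests of this kind is typically a naive recursive Fibonacci computation. The sketch below is not the authors' harness: the servers and HTTP layer are omitted, and the client counts are hypothetical. It simulates concurrent clients calling the handler body directly and reports aggregate throughput:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fib(n):
    # Naive recursive Fibonacci: the CPU-intensive handler body
    # requested by the benchmark clients (10th, 20th, 30th term).
    return n if n < 2 else fib(n - 1) + fib(n - 2)

def benchmark(term, clients, requests_per_client):
    """Simulate `clients` concurrent users each issuing requests for
    fib(term) and return (throughput in requests/second, one result)."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=clients) as pool:
        results = list(pool.map(lambda _: fib(term),
                                range(clients * requests_per_client)))
    elapsed = time.perf_counter() - start
    return len(results) / elapsed, results[0]

throughput, value = benchmark(term=20, clients=8, requests_per_client=5)
print(f"fib(20) = {value}, throughput ≈ {throughput:.0f} req/s")
```

In the real benchmark, `fib` would run inside each server's request handler and the clients would issue HTTP requests instead of direct calls.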

Keywords: apache, Go, Nginx, node.js, web server benchmarking

Procedia PDF Downloads 99
4658 A Deep Learning Approach for the Predictive Quality of Directional Valves in the Hydraulic Final Test

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

The increasing use of deep learning applications in production is becoming a source of competitive advantage. Predictive quality enables the assurance of product quality by using data-driven forecasts from machine learning models as a basis for decisions on test results. The use of real Bosch production data along the value chain of hydraulic valves is a promising approach to classifying the leakage of directional valves.

Keywords: artificial neural networks, classification, hydraulics, predictive quality, deep learning

Procedia PDF Downloads 248
4657 Quality and Yield of Potato Seed Tubers as Influenced by Plant Growth Promoting Rhizobacteria

Authors: Muhammad Raqib Rasul, Tavga Sulaiman Rashid

Abstract:

Fertilization increases efficiency and improves product quality in agricultural activities. However, fertilizer consumption has increased exponentially throughout the world, causing severe environmental problems. Biofertilizers can be a practical approach to minimizing chemical fertilizer use and ultimately improving soil fertility. This study was carried out to isolate, identify, and characterize bacteria from the rhizospheres of the medicinal plants Rumex tuberosus L. and Verbascum sp. for in vivo screening. Twenty-five bacterial isolates were obtained, and several biochemical tests were performed. Two isolates that were positive for most biochemical tests were chosen for the field experiment. The isolates were identified as GO1 (Alcaligenes faecalis, Accession No. OP001725) and T11 (Bacillus sp.) based on 16S rRNA sequence analysis, compared with related bacteria in the GenBank database using MEGA 6.1. For the field trial, isolates GO1 and T11 (separately and mixed) were tested, with NPK used as a positive control. Both isolates increased plant height, chlorophyll content, number of tubers, and tuber weight. The results demonstrate that these two bacterial isolates can potentially replace chemical fertilizers in potato production.

Keywords: biofertilizer, Bacillus subtilis, Alcaligenes faecalis, potato tubers, in vivo screening

Procedia PDF Downloads 103
4656 Experimental Study of Structural Insulated Panel under Lateral Load

Authors: H. Abbasi, K. Sennah

Abstract:

A structural insulated panel (SIP) is a structural element consisting of a foam insulation core sandwiched between two oriented strand boards (OSB), plywood boards, steel sheets, or fibre cement boards. Superior insulation, exceptional strength, and fast installation are the hallmarks of a SIP-based structure. Compared to wood-framed houses, there are also many other benefits, such as lower total construction cost, faster construction, less expensive HVAC equipment, and favourable energy-efficient mortgages. This paper presents an experimental study of selected foam-timber SIPs to investigate their structural behaviour when used as walls in residential construction under lateral loading. The experimental program also included several stud panels in order to compare the performance of SIPs with the conventional wood-frame system. The results of the lateral tests performed in this study established a database that can be used to develop design tables for SIP walls subjected to lateral loading caused by wind or earthquake, and such a design table was developed. The experimental results proved that the tested SIPs are 'as good as' the conventional wood-frame system.

Keywords: structural insulated panel, experimental study, lateral load, design tables

Procedia PDF Downloads 318
4655 Effects of Compensation on Distribution System Technical Losses

Authors: B. Kekezoglu, C. Kocatepe, O. Arikan, Y. Hacialiefendioglu, G. Ucar

Abstract:

One of the significant problems of energy systems is supplying economical and efficient energy to consumers. Studies have therefore continued with the aim of reducing technical losses in the network. In this paper, the technical losses of a portion of the European-side Istanbul MV distribution network are analyzed for different compensation scenarios, considering real system and load data, and the results are presented. The investigated system is modeled with the CYME Power Engineering software, and optimal capacitor placement is proposed to minimize losses.
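
As a rough illustration of why reactive power compensation reduces technical losses (a back-of-the-envelope sketch with invented feeder numbers, not data from the studied Istanbul network): line losses scale with the square of the apparent power, so supplying reactive power locally with a capacitor lowers the feeder current and hence the I²R loss.

```python
def feeder_loss_kw(p_kw, q_kvar, v_kv, r_ohm):
    """Three-phase I^2*R loss for a feeder with per-phase resistance r_ohm."""
    s_kva = (p_kw ** 2 + q_kvar ** 2) ** 0.5      # apparent power
    i_a = s_kva / (3 ** 0.5 * v_kv)               # line current in A
    return 3 * i_a ** 2 * r_ohm / 1000.0          # total loss in kW

# Hypothetical 10 kV feeder serving 800 kW at 600 kvar, R = 0.5 ohm/phase
before = feeder_loss_kw(p_kw=800.0, q_kvar=600.0, v_kv=10.0, r_ohm=0.5)
# A 500 kvar capacitor at the load end leaves only 100 kvar on the line
after = feeder_loss_kw(p_kw=800.0, q_kvar=100.0, v_kv=10.0, r_ohm=0.5)
print(before, after)   # losses drop from 5.0 kW to 3.25 kW
```

The 35% loss reduction equals the reduction in S², which is the effect the optimal capacitor placement in the paper exploits network-wide.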

Keywords: distribution system, optimal capacitor placement, reactive power compensation, technical losses

Procedia PDF Downloads 674
4654 Riesz Mixture Model for Brain Tumor Detection

Authors: Mouna Zitouni, Mariem Tounsi

Abstract:

This research introduces an application of the Riesz mixture model for medical image segmentation for accurate diagnosis and treatment of brain tumors. We propose a pixel classification technique based on the Riesz distribution, derived from an extended Bartlett decomposition. To our knowledge, this is the first study addressing this approach. The Expectation-Maximization algorithm is implemented for parameter estimation. A comparative analysis, using both synthetic and real brain images, demonstrates the superiority of the Riesz model over a recent method based on the Wishart distribution.
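
The abstract does not give the Riesz density itself, so as a hedged illustration of the Expectation-Maximization procedure it mentions, the sketch below runs EM on a two-component one-dimensional Gaussian mixture with synthetic data; the paper's method would substitute the Riesz density for the Gaussian pdf while keeping the same E-step/M-step structure:

```python
import math, random

def em_two_component(xs, iters=50):
    # EM for a two-component 1-D Gaussian mixture: a stand-in density
    # for illustration only; the Riesz pdf would replace `pdf` below.
    mu = [min(xs), max(xs)]     # crude initialisation
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    def pdf(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in xs:
            w = [pi[k] * pdf(x, mu[k], var[k]) for k in (0, 1)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate mixture weights, means and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, xs)) / nk + 1e-9
    return mu, pi

random.seed(0)
xs = ([random.gauss(0, 1) for _ in range(200)]
      + [random.gauss(5, 1) for _ in range(200)])
mu, pi = em_two_component(xs)   # means recovered near 0 and 5
```

In segmentation, each pixel intensity would play the role of `x`, and the final responsibilities give the pixel classification.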

Keywords: EM algorithm, segmentation, Riesz probability distribution, Wishart probability distribution

Procedia PDF Downloads 21
4653 Integration of a Microbial Electrolysis Cell and an Oxy-Combustion Boiler

Authors: Ruth Diego, Luis M. Romeo, Antonio Morán

Abstract:

In the present work, the coupling of a bioelectrochemical system with an oxy-combustion boiler is studied. Specifically, it is proposed to connect the flue gas outlet of the boiler to a microbial electrolysis cell (MEC), where the CO2 in the gases is transformed into methane in the cathode chamber, and the oxygen produced in the anode chamber is recirculated to the oxy-combustion boiler. The MEC consists mainly of two electrodes (anode and cathode) immersed in an aqueous electrolyte; these electrodes are separated by a proton exchange membrane (PEM). In this case, the anode is abiotic (where oxygen is produced), and it is at the cathode that an electroactive biofilm forms, with microorganisms that catalyze the CO2 reduction reactions. Real data from an oxy-combustion process in a boiler of around 20 MW thermal have been used for this study and are combined with data obtained at smaller (laboratory-pilot) scale to determine the yields that could be obtained, treating the system as environmentally sustainable energy storage. The aim is thus to integrate a relatively conventional energy production system (oxy-combustion) with a biological system (the microbial electrolysis cell), a challenge to be addressed in this type of new hybrid scheme. A novel concept is presented, with the basic dimensioning of the necessary equipment and the efficiency of the overall process. It has been calculated that the efficiency of this power-to-gas system based on MEC cells, when coupled to industrial processes, is of the same order of magnitude as that of the most promising equivalent routes. The proposed process has two main limitations: the overpotentials at the electrodes, which penalize the overall efficiency, and the need for storage tanks for the process gases. The calculations carried out in this work show that acceptable performance is achieved at realistic electrode potentials. Regarding the tanks, with adequate dimensioning it is possible to achieve complete autonomy. The proposed system, called OxyMES, provides energy storage without energetically penalizing the process compared to an oxy-combustion plant with conventional CO2 capture. According to the results obtained, this system can be applied as a measure to decarbonize an industry by changing the original fuel of the oxy-combustion boiler to the biogas generated in the MEC cell. It could also be used to neutralize CO2 emissions from industry by converting them to methane and injecting it into the natural gas grid.

Keywords: microbial electrolysis cells, oxy-combustion, co2, power-to-gas

Procedia PDF Downloads 109
4652 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios

Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu

Abstract:

We present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, compared to directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is in fact more efficient to calculate the transform of the distribution function in the Fourier domain; inverting back to the real domain can then be done in one step and semi-analytically, thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, various risk metrics can easily be calculated. The proposed method not only fills a niche in the literature, to the best of our knowledge, of accurate numerical methods for risk allocation, but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are shown to be significantly superior to those of Monte Carlo simulation for real-sized portfolios. The computational complexity is, by design, driven primarily by the number of factors rather than the number of obligors, as is the case with Monte Carlo simulation. The limitation of this method lies in the "curse of dimensionality" that is intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential applications of this method span a wide range: from credit derivatives pricing to economic capital calculation of the banking book, default risk charge and incremental risk charge computation of the trading book, and even other risk types than credit risk.
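
As a minimal illustration of the COS idea the abstract relies on (a toy example, not the paper's factor-copula portfolio-loss model): recover a density on a truncated interval from its characteristic function. Here the target is the standard normal, whose characteristic function is exp(-u²/2):

```python
import cmath, math

def cos_density(phi, a, b, N):
    """Recover a density on [a, b] from its characteristic function `phi`
    via the COS (Fourier-cosine) expansion."""
    coeffs = []
    for k in range(N):
        u = k * math.pi / (b - a)
        # Cosine coefficients from the characteristic function
        Fk = (2.0 / (b - a)) * (phi(u) * cmath.exp(-1j * u * a)).real
        coeffs.append(Fk)
    def f(x):
        s = 0.5 * coeffs[0]            # first term taken with weight 1/2
        for k in range(1, N):
            s += coeffs[k] * math.cos(k * math.pi * (x - a) / (b - a))
        return s
    return f

# Characteristic function of the standard normal: exp(-u^2/2)
phi = lambda u: cmath.exp(-0.5 * u * u)
f = cos_density(phi, a=-10.0, b=10.0, N=64)
print(f(0.0))   # ≈ 1/sqrt(2*pi) ≈ 0.3989
```

In the paper's setting, `phi` would be the (conditional) characteristic function of the portfolio loss implied by the factor-copula model, and the recovered distribution function then yields Value-at-Risk and Expected Shortfall semi-analytically.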

Keywords: credit portfolio, risk allocation, factor copula model, the COS method, Fourier method

Procedia PDF Downloads 169
4651 Cloud Computing: Major Issues and Solutions

Authors: S. Adhirai Subramaniyam, Paramjit Singh

Abstract:

This paper presents major issues in cloud computing. It describes the different cloud computing deployment models and cloud service models available in the field, then concentrates on various issues, such as cloud compatibility, cloud compliance, standardizing cloud technology, monitoring while on the cloud, and cloud security. The paper suggests solutions for these issues and concludes that a hybrid cloud infrastructure is a real boon for organizations.

Keywords: cloud, cloud computing, mobile cloud computing, private cloud, public cloud, hybrid cloud, SAAS, PAAS, IAAS, cloud security

Procedia PDF Downloads 345
4650 An Alternative Concept of Green Screen Keying

Authors: Jin Zhi

Abstract:

This study focuses on a green screen keying method developed especially for film visual effects. There is a range of existing tools for creating mattes from green or blue screen plates. However, it is still a time-consuming process, and the results vary, especially when it comes to retaining tiny details such as hair and fur. This paper introduces an alternative concept and method for retaining the edge details of characters on a green screen plate, and a number of related mathematical equations are explored. At the end of this study, a simplified process for applying this method in real productions is also introduced.
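
As a hedged toy illustration of the kind of per-pixel equations involved in green screen keying (not the method proposed in the paper): a simple difference key derives the matte alpha from how much the green channel exceeds the other two channels, with an arbitrary gain constant and a basic green-spill suppression step.

```python
def key_pixel(r, g, b, gain=2.0, spill=0.5):
    """Toy difference key on one RGB pixel in [0, 1].
    alpha = 0 means fully transparent (pure screen), 1 means opaque.
    All constants here are illustrative, not from the paper."""
    excess = g - max(r, b)                       # how "green screen" the pixel is
    alpha = 1.0 - max(0.0, min(1.0, gain * excess))
    # Spill suppression: pull green back towards the other channels
    g_out = min(g, max(r, b) + (1.0 - spill) * max(0.0, excess))
    return alpha, (r, g_out, b)

# Pure green-screen pixel -> fully transparent
a_bg, _ = key_pixel(0.1, 0.9, 0.1)
# Skin-tone foreground pixel -> fully opaque
a_fg, _ = key_pixel(0.8, 0.6, 0.5)
print(a_bg, a_fg)
```

Retaining hair and fur detail, the focus of the paper, amounts to producing fractional alpha values along such edges rather than the hard 0/1 of this sketch.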

Keywords: green screen, visual effects, compositing, matte

Procedia PDF Downloads 405
4649 Forecasting of Grape Juice Flavor by Using Support Vector Regression

Authors: Ren-Jieh Kuo, Chun-Shou Huang

Abstract:

Juice flavor forecasting has become increasingly important in China. Due to the country's fast economic growth, many different kinds of juices have been introduced to the market. If a beverage company understands its customers' preferences well, its juice can be marketed more attractively. This study therefore introduces the basic theory and computing process of grape juice flavor forecasting based on support vector regression (SVR). Applying SVR, BPN, and LR to forecast the flavor of grape juice on real data, the results show that SVR is more suitable and achieves better predictive performance.
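
True SVR with an ε-insensitive loss needs a quadratic-programming solver; as a self-contained stand-in that shows the same kernel-regression idea on hypothetical flavor data, the sketch below fits kernel ridge regression with an RBF kernel (the data points, feature, and parameters are all invented for illustration):

```python
import math

def rbf(x, y, gamma=1.0):
    return math.exp(-gamma * (x - y) ** 2)

def solve(A, b):
    # Gaussian elimination with partial pivoting (fine for tiny systems).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_krr(xs, ys, lam=1e-6):
    # Kernel ridge regression: alpha = (K + lam*I)^-1 y
    K = [[rbf(a, b) + (lam if i == j else 0.0) for j, b in enumerate(xs)]
         for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return lambda x: sum(a * rbf(x, xi) for a, xi in zip(alpha, xs))

# Hypothetical "sensory score vs. sugar/acid ratio" training data
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [1.0, 2.2, 2.9, 3.1, 2.5]
predict = fit_krr(xs, ys)
```

An SVR would differ mainly in using only a sparse subset of the points (the support vectors) and in tolerating errors below ε; the kernel machinery is the same.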

Keywords: flavor forecasting, artificial neural networks, Support Vector Regression, China

Procedia PDF Downloads 494
4648 Biomedical Definition Extraction Using Machine Learning with Synonymous Feature

Authors: Jian Qu, Akira Shimazu

Abstract:

OOV (Out-of-Vocabulary) terms are terms that cannot be found in many dictionaries. Although it is possible to translate such OOV terms, the translations do not provide any real information to a user. We present an OOV term definition extraction method that uses information available on the Internet. We use features such as the occurrence of synonyms and location distances, and we apply a machine learning method to find the correct definitions for OOV terms. We tested our method on both biomedical-type and name-type OOV terms; our method outperforms existing work with an accuracy of 86.5%.

Keywords: information retrieval, definition retrieval, OOV (out of vocabulary), biomedical information retrieval

Procedia PDF Downloads 496
4647 Towards Human-Interpretable, Automated Learning of Feedback Control for the Mixing Layer

Authors: Hao Li, Guy Y. Cornejo Maceda, Yiqing Li, Jianguo Tan, Marek Morzynski, Bernd R. Noack

Abstract:

We propose an automated analysis of the flow control behaviour from an ensemble of control laws and associated time-resolved flow snapshots. The input may be the rich database of machine learning control (MLC) optimizing a feedback law for a cost function in the plant. The proposed methodology provides (1) insights into the control landscape, which maps control laws to performance, including extrema and ridge-lines, (2) a catalogue of representative flow states and their contribution to cost function for investigated control laws and (3) visualization of the dynamics. Key enablers are classification and feature extraction methods of machine learning. The analysis is successfully applied to the stabilization of a mixing layer with sensor-based feedback driving an upstream actuator. The fluctuation energy is reduced by 26%. The control replaces unforced Kelvin-Helmholtz vortices with subsequent vortex pairing by higher-frequency Kelvin-Helmholtz structures of lower energy. These efforts target a human interpretable, fully automated analysis of MLC identifying qualitatively different actuation regimes, distilling corresponding coherent structures, and developing a digital twin of the plant.

Keywords: machine learning control, mixing layer, feedback control, model-free control

Procedia PDF Downloads 226
4646 Community Structure Detection in Networks Based on Bee Colony

Authors: Bilal Saoud

Abstract:

In this paper, we propose a new method for finding the community structure in networks, based on a bee colony algorithm and the maximization of modularity. We use the bee colony algorithm to find an initial community structure with a good modularity value. To improve the community structure that was found, we then merge communities until we obtain a community structure with a high modularity value. We provide a general framework for implementing our approach. We tested our method on computer-generated and real-world networks, comparing it with well-known community detection methods. The obtained results show the effectiveness of our proposal.
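
The modularity-driven merging step described above can be sketched as follows (an illustrative greedy merge on a toy graph; in the paper, the bee colony algorithm supplies the starting partition, here replaced by singletons):

```python
from itertools import combinations

def modularity(edges, communities):
    """Newman modularity Q = sum_c [ L_c/m - (d_c/2m)^2 ] for an undirected
    graph given as an edge list and a partition given as a list of node sets."""
    m = len(edges)
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    q = 0.0
    for c in communities:
        lc = sum(1 for u, v in edges if u in c and v in c)   # intra edges
        dc = sum(deg[n] for n in c)                          # total degree
        q += lc / m - (dc / (2 * m)) ** 2
    return q

def greedy_merge(edges, communities):
    # Repeatedly merge the pair of communities that most increases Q,
    # stopping when no merge improves modularity.
    best = modularity(edges, communities)
    while len(communities) > 1:
        candidates = []
        for i, j in combinations(range(len(communities)), 2):
            merged = [c for k, c in enumerate(communities) if k not in (i, j)]
            merged.append(communities[i] | communities[j])
            candidates.append((modularity(edges, merged), merged))
        q, merged = max(candidates, key=lambda t: t[0])
        if q <= best:
            break
        best, communities = q, merged
    return communities, best

# Two triangles joined by a single bridge edge
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
parts, q = greedy_merge(edges, [{0}, {1}, {2}, {3}, {4}, {5}])
print(parts, q)   # the two triangles, Q = 5/14
```

Starting from a good bee-colony partition instead of singletons would simply give this merge loop a better initial `best`.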

Keywords: bee colony, networks, modularity, normalized mutual information

Procedia PDF Downloads 409
4645 The Evolution of Architecture through Digital: A Survey on Fashion Catwalk Becoming Digital

Authors: Valeria Minucciani, Maria Maddalena Margaria

Abstract:

While the mathematical tools that make digital architecture possible are very sophisticated and advanced, the theoretical development of digital architecture (understood as a discipline that integrates or replaces real architecture) is not. The fashion show, which involves interior architecture, exhibit design, and scenography, has for ten years been exploiting the opportunities offered by digital technologies, and high-level experiments have been performed to gain greater visibility and reach a wider audience. The aim of this paper is to investigate, through the analysis of several virtual fashion shows, the 'architectural' impact of the virtual conception of interior space.

Keywords: digital interiors, exhibit, fashion catwalk, architectural theory

Procedia PDF Downloads 428
4644 Bionaut™: A Breakthrough Robotic Microdevice to Treat Non-Communicating Hydrocephalus in Both Adult and Pediatric Patients

Authors: Suehyun Cho, Darrell Harrington, Florent Cros, Olin Palmer, John Caputo, Michael Kardosh, Eran Oren, William Loudon, Alex Kiselyov, Michael Shpigelmacher

Abstract:

Bionaut Labs, LLC is developing a minimally invasive robotic microdevice designed to treat non-communicating hydrocephalus in both adult and pediatric patients. The device utilizes biocompatible microsurgical particles (Bionaut™) that are specifically designed to safely and reliably perform accurate fenestration(s) in the 3rd ventricle, aqueduct of Sylvius, and/or trapped intraventricular cysts of the brain in order to re-establish normal cerebrospinal fluid (CSF) flow dynamics and thereby balance and/or normalize intra/intercompartmental pressure. The Bionaut™ is navigated to the target via CSF or brain tissue in a minimally invasive fashion with precise control using real-time imaging. Upon reaching the pre-defined anatomical target, the external driver directs the specific microsurgical action defined to achieve the surgical goal. Notable features of the proposed protocol are: i) Bionaut™ access to the intraventricular target follows a clinically validated endoscopy trajectory that may not be feasible via 'traditional' rigid endoscopy; ii) the treatment is microsurgical, and no foreign materials are left behind post-procedure; iii) the Bionaut™ is an untethered device that is navigated through the subarachnoid and intraventricular compartments of the brain, following pre-designated non-linear trajectories as determined by the safest anatomical and physiological path; iv) the overall protocol involves minimally invasive delivery and post-operational retrieval of the surgical Bionaut™. The approach is expected to be suitable for treating pediatric patients 0-12 months old as well as adult patients with obstructive hydrocephalus who fail traditional shunts or are eligible for endoscopy. Current progress, including platform optimization, Bionaut™ control, real-time imaging, and in vivo safety studies of the Bionauts™ in large animals, specifically the spine and brain of ovine models, will be discussed.

Keywords: Bionaut™, cerebrospinal fluid, CSF, fenestration, hydrocephalus, micro-robot, microsurgery

Procedia PDF Downloads 172
4643 Data Mining of Students' Performance Using Artificial Neural Network: Turkish Students as a Case Study

Authors: Samuel Nii Tackie, Oyebade K. Oyedotun, Ebenezer O. Olaniyi, Adnan Khashman

Abstract:

Artificial neural networks have been used in different fields of artificial intelligence, and more specifically in machine learning. Although other machine learning options are feasible in most situations, the ease with which neural networks lend themselves to different problems, including pattern recognition, image compression, classification, computer vision, and regression, has earned them a remarkable place in the machine learning field. This research exploits neural networks as a data mining tool for predicting the number of times a student repeats a course, considering attributes relating to the course itself, the teacher, and the particular student. Neural networks were used in this work to map the relationship between attributes related to students' course assessment and the number of times a student will possibly repeat a course before passing it. The hope is that the ability to predict students' performance from such complex relationships can help facilitate the fine-tuning of academic systems and policies implemented in learning environments. To validate the power of neural networks in data mining, a database of Turkish students' performance was used; feedforward and radial basis function networks were trained for this task, and their performance was evaluated in terms of achieved recognition rates and training time.
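
As a hedged sketch of the mapping the abstract describes, the following trains a tiny feedforward network by gradient descent on synthetic "student" records (the attributes, labels, and network size are invented for illustration; the study used a real Turkish students' database and also radial basis function networks):

```python
import math, random

random.seed(1)

def sigmoid(z):
    z = max(-60.0, min(60.0, z))   # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic records: (attendance, midterm score) in [0, 1];
# label 1 = the student will repeat the course (invented rule).
data = [((a / 4, m / 4), 1 if a / 4 + m / 4 < 1.0 else 0)
        for a in range(5) for m in range(5)]

# 2-3-1 feedforward network with random initial weights
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
b1 = [0.0] * 3
W2 = [random.uniform(-1, 1) for _ in range(3)]
b2 = 0.0

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    y = sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)
    return h, y

def epoch(lr=0.5):
    global b2
    loss = 0.0
    for x, t in data:
        h, y = forward(x)
        p = min(max(y, 1e-12), 1 - 1e-12)
        loss += -(t * math.log(p) + (1 - t) * math.log(1 - p))
        d_out = y - t                    # dL/dz for sigmoid + cross-entropy
        for j in range(3):
            d_h = d_out * W2[j] * h[j] * (1 - h[j])   # backprop to hidden
            W2[j] -= lr * d_out * h[j]
            for i in range(2):
                W1[j][i] -= lr * d_h * x[i]
            b1[j] -= lr * d_h
        b2 -= lr * d_out
    return loss / len(data)

first = epoch()
for _ in range(1999):
    last = epoch()
acc = sum((forward(x)[1] > 0.5) == (t == 1) for x, t in data) / len(data)
```

A radial basis function network, the other architecture evaluated in the paper, would replace the sigmoid hidden layer with Gaussian units centred on prototype records.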

Keywords: artificial neural network, data mining, classification, students’ evaluation

Procedia PDF Downloads 615
4642 Rapid, Direct, Real-Time Method for Bacteria Detection on Surfaces

Authors: Evgenia Iakovleva, Juha Koivisto, Pasi Karppinen, J. Inkinen, Mikko Alava

Abstract:

Preventing the spread of infectious diseases throughout the world is one of the most important tasks of modern health care. Infectious diseases not only account for one fifth of deaths worldwide but also cause many pathological complications for human health. Touch surfaces pose an important vector for the spread of infections by various microorganisms, including antimicrobial-resistant organisms. Antimicrobial resistance, in turn, is the response of bacteria to the overuse or inappropriate use of antibiotics. The biggest challenges in bacterial detection by existing methods are indirect determination, long analysis times, sample preparation, the use of chemicals and expensive equipment, and the need for qualified specialists. Therefore, high-performance, rapid, real-time detection is demanded for practical bacterial detection and for controlling epidemiological hazards. Among the known methods for determining bacteria on surfaces, hyperspectral methods can serve as direct and rapid fluorescence-based techniques for detecting microorganisms on different kinds of surfaces without sampling, sample preparation, or chemicals. The aim of this study was to assess the suitability of such systems for remote sensing of surfaces for microorganism detection, in order to help prevent the global spread of infectious diseases. Bacillus subtilis and Escherichia coli at different concentrations (from 0 to 10⁸ cells/100 µL) were detected with a hyperspectral camera using different filters to visualize bacteria and background spots on a steel plate. A method of internal standards was applied to monitor the correctness of the analysis results. The distances from the sample to the hyperspectral camera and to the light source are 25 cm and 40 cm, respectively. Each sample is optically imaged from the surface by a hyperspectral imaging system utilizing a JAI CM-140GE-UV camera. The light source is a BeamZ FLATPAR DMX tri-light with 3 W tri-colour LEDs (red, blue and green). The light colors are changed through a DMX USB Pro interface. The developed system was calibrated following a standard procedure of setting the exposure and focusing for light with λ = 525 nm. The filter is a Thorlabs Kurios™ hyperspectral filter controller with wavelengths from 420 to 720 nm. All data collection, pre-processing, and multivariate analysis were performed using LabVIEW and Python software. The studied bacterial stains, both visible and invisible to the human eye, clustered apart from the reference steel material in clustering analyses using different light sources and filter wavelengths. The calculation of the random and systematic errors of the analysis results proved the applicability of the method under real conditions. Validation experiments were carried out with photometry and an ATP swab test; the lower detection limit of the developed method is several orders of magnitude lower than that of both validation methods, with all experimental parameters kept the same except for the light. The hyperspectral imaging method makes it possible to separate not only bacteria and surfaces but also different types of bacteria, such as Gram-negative Escherichia coli and Gram-positive Bacillus subtilis. The developed method allows the sample preparation and the use of chemicals to be skipped, unlike other microbiological methods. The analysis time with the novel hyperspectral system is a few seconds, which is innovative in the field of microbiological testing.

Keywords: Escherichia coli, Bacillus subtilis, hyperspectral imaging, microorganisms detection

Procedia PDF Downloads 229
4641 Study on Improvement the Performance of Construction Project Using Lean Principles

Authors: Sumaya Adina

Abstract:

The productivity of the construction industry has faced numerous challenges, rising costs, and scarce resources over the past forty years; therefore, one approach for improving and enhancing the framework is the use of lean techniques. Lean method outcomes from the use of a brand-form of manufacturing control in production. At a time when sustainability and efficiency are essential, lean offers a clear path to make the construction industry fit for the future. An excessive number of construction professionals and experts have efficiently optimised development initiatives using lean construction (LC) techniques to reduce waste, maximise value creation, and focus on the process that creates real added value and continuous improvement, strengthening flexibility and adaptability. The present research has been undertaken to study the improvement in the performance of construction projects using lean principles. The study work is divided into three stages. Initially, a questionnaire survey was conducted on visual management techniques to improve the performance of the construction projects. The questionnaire was distributed to civil engineers, architects, site managers, project managers, and full-time executives, with nearly 100 questionnaires shared with respondents. A total of 83 responses were received to determine the reliability of the data, and analysis was done using SPSS software. In the second stage, the impact of value stream mapping on the real-time project is determined and its performance in the form of time and cost reduction is evaluated. The case study examines a bunker-building project located in Kabul Afghanistan; the project was planned conventionally without considering the lean concepts. To reduce overall kinds of waste in the project, a plan was developed using the Vico Control software to visualize the value stream of the project. 
Finally, the impact of value stream mapping on the project's total cash flow is evaluated and compared by plotting the total cash flow curve using Vico software. As a result, labour costs were reduced by 33% and the duration of the project by 17%. Reducing the duration also improved the cash flow of the entire project by 14%, raising it from negative 67% to negative 44%.

Keywords: lean construction, cost and time overrun, value stream mapping, construction efficiency

Procedia PDF Downloads 9
4640 Spatio-Temporal Data Mining with Association Rules for Lake Van

Authors: Tolga Aydin, M. Fatih Alaeddinoğlu

Abstract:

Throughout history, people have made estimates and inferences about the future by using their past experiences. Developing information technologies and improvements in database management systems make it possible to extract useful information from the knowledge in hand for strategic decisions. Therefore, different methods have been developed. Data mining by association rules learning is one such method. The Apriori algorithm, one of the well-known association rules learning algorithms, is not commonly applied to spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and make the Apriori algorithm a suitable data mining technique for learning spatio-temporal association rules. Lake Van, the largest lake of Turkey, is a closed basin. This feature causes the volume of the lake to increase or decrease as a result of changes in the amount of water it holds. In this study, evaporation, humidity, lake altitude, amount of rainfall, and temperature parameters recorded in the Lake Van region over the years are used by the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring in the coastal parts of Lake Van. Identifying possible reasons for overflows and underflows may be used to alert experts to take precautions and make the necessary investments.
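The embedding of discretised time and space features into transactions, as described above, can be illustrated with a minimal Apriori sketch. The item labels (e.g. "high_rainfall", "overflow") and the toy records are purely illustrative assumptions, not the study's data:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return all frequent itemsets with their support (fraction of transactions)."""
    n = len(transactions)
    items = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    # Frequent 1-itemsets
    current = {s for s in items
               if sum(s <= t for t in transactions) / n >= min_support}
    k = 1
    while current:
        for s in current:
            frequent[s] = sum(s <= t for t in transactions) / n
        # Candidate (k+1)-itemsets from unions of frequent k-itemsets
        candidates = {a | b for a in current for b in current if len(a | b) == k + 1}
        current = {c for c in candidates
                   if sum(c <= t for t in transactions) / n >= min_support}
        k += 1
    return frequent

# Toy seasonal records for the Lake Van setting (hypothetical discretised labels)
records = [
    {"spring", "high_rainfall", "overflow"},
    {"spring", "high_rainfall", "overflow"},
    {"summer", "high_evaporation", "underflow"},
    {"spring", "high_rainfall", "overflow"},
    {"summer", "high_evaporation", "underflow"},
]
freq = apriori([frozenset(r) for r in records], min_support=0.6)

# Confidence of the rule {spring, high_rainfall} -> {overflow}
conf = freq[frozenset({"spring", "high_rainfall", "overflow"})] / \
       freq[frozenset({"spring", "high_rainfall"})]
print(conf)  # 1.0 in this toy data
```

In a real application, the transactions would be yearly records with binned sensor values and station locations as items, so that the mined rules relate seasons and measurement sites to overflow and underflow events.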

Keywords: apriori algorithm, association rules, data mining, spatio-temporal data

Procedia PDF Downloads 375
4639 Effects of Macroprudential Policies on Bank Lending and Risks

Authors: Stefanie Behncke

Abstract:

This paper analyses the effects of different macroprudential policy measures that have recently been implemented in Switzerland. Among them is the activation and the increase of the countercyclical capital buffer (CCB) and a tightening of loan-to-value (LTV) requirements. These measures were introduced to limit systemic risks in the Swiss mortgage and real estate markets. They were meant to affect mortgage growth, mortgage risks, and banks’ capital buffers. Evaluation of their quantitative effects provides insights for Swiss policymakers when reassessing their policy. It is also informative for policymakers in other countries who plan to introduce macroprudential instruments. We estimate the effects of the different macroprudential measures with a Differences-in-Differences estimator. Banks differ with respect to the relative importance of mortgages in their portfolio, their riskiness, and their capital buffers. Thus, some of the banks were more affected than others by the CCB, while others were more affected by the LTV requirements. Our analysis is made possible by an unusually informative bank panel data set. It combines data on newly issued mortgage loans and quantitative risk indicators such as LTV and loan-to-income (LTI) ratios with supervisory information on banks’ capital and liquidity situation and balance sheets. Our results suggest that the LTV cap of 90% was most effective. The proportion of new mortgages with a high LTV ratio was significantly reduced. This result does not only apply to the 90% LTV, but also to other threshold values (e.g. 80%, 75%) suggesting that the entire upper part of the LTV distribution was affected. Other outcomes such as the LTI distribution, the growth rates of mortgages and other credits, however, were not significantly affected. Regarding the activation and the increase of the CCB, we do not find any significant effects: neither LTV/LTI risk parameters nor mortgage and other credit growth rates were significantly reduced. 
This result may reflect that the size of the CCB (1% of relevant residential real-estate risk-weighted assets at activation, 2% at the increase) was not high enough to trigger a distinct reaction between the banks most likely to be affected by the CCB and those serving as controls. Still, it might have been effective in increasing the resilience of the overall banking system. From a policy perspective, these results suggest that targeted macroprudential policy measures can contribute to financial stability. In line with findings by others, caps on LTV reduced risk-taking in Switzerland. To fully assess the effectiveness of the CCB, further experience is needed.
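The differences-in-differences logic used above compares the change in an outcome for affected banks with the change for unaffected banks. A minimal sketch on toy panel data (bank labels, numbers, and the outcome variable are illustrative, not the paper's data set):

```python
import statistics

# (bank, period, treated, outcome) — outcome: share of new mortgages with high LTV
panel = [
    ("A", "pre", 1, 0.30), ("A", "post", 1, 0.18),
    ("B", "pre", 1, 0.28), ("B", "post", 1, 0.16),
    ("C", "pre", 0, 0.25), ("C", "post", 0, 0.24),
    ("D", "pre", 0, 0.27), ("D", "post", 0, 0.26),
]

def mean_outcome(treated, period):
    """Average outcome for one group in one period."""
    return statistics.mean(y for _, p, d, y in panel if d == treated and p == period)

# DiD = (treated post - treated pre) - (control post - control pre)
did = (mean_outcome(1, "post") - mean_outcome(1, "pre")) - \
      (mean_outcome(0, "post") - mean_outcome(0, "pre"))
print(round(did, 3))  # -0.11: the policy reduced high-LTV lending in this toy data
```

The key identifying assumption, as in the paper's setting, is that the control banks' trend is a valid counterfactual for the treated banks.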

Keywords: banks, financial stability, macroprudential policy, mortgages

Procedia PDF Downloads 362
4638 Comparative Advantage of Mobile Agent Application in Procuring Software Products on the Internet

Authors: Michael K. Adu, Boniface K. Alese, Olumide S. Ogunnusi

Abstract:

This paper brings to the fore the inherent advantages of applying mobile agents to procure software products rather than downloading software content over the Internet. It proposes a system in which the product is shipped on compact disk with a mobile agent as part of the deliverable. The client/user purchases a software product but must connect to the remote server of the software developer before installation. The user provides an activation code that activates the mobile agent included with the software product on the compact disk. The validity of the activation code is checked on connection at the developer's end to ascertain authenticity and prevent piracy. The system is evaluated by downloading two different software products and comparing this with installing the same products from compact disk using the mobile-agent approach. Downloading software content from the developer's database, as in the traditional method, requires a continuously open connection between the client and the developer's end; such a fixed connection is not always economically or technically feasible. A mobile agent, after being dispatched into the network, becomes independent of the creating process and can operate asynchronously and autonomously. It can reconnect later, after completing its task, and return to deliver the result. Response time and network load are minimal with the mobile-agent approach.
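The server-side activation-code check described above could be sketched as a keyed-hash verification. This is a hypothetical scheme for illustration only; the paper does not specify how its codes are generated, and the names and key below are invented:

```python
import hmac
import hashlib

SECRET_KEY = b"developer-side secret"  # held only on the developer's server

def issue_activation_code(product_id: str, licence_no: str) -> str:
    """Derive an activation code bound to a product and licence (hypothetical scheme)."""
    msg = f"{product_id}:{licence_no}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()[:16]

def verify_activation_code(product_id: str, licence_no: str, code: str) -> bool:
    """Server-side check the mobile agent triggers on connection."""
    expected = issue_activation_code(product_id, licence_no)
    return hmac.compare_digest(expected, code)

code = issue_activation_code("prod-001", "LIC-42")
print(verify_activation_code("prod-001", "LIC-42", code))   # True
print(verify_activation_code("prod-001", "LIC-99", code))   # False
```

Because the key never leaves the developer's server, a code copied from one licence cannot be replayed for another, which matches the anti-piracy goal stated in the abstract.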

Keywords: software products, software developer, internet, activation code, mobile agent

Procedia PDF Downloads 312
4637 Sensitive Detection of Nano-Scale Vibrations by the Metal-Coated Fiber Tip at the Liquid-Air Interface

Authors: A. J. Babajanyan, T. A. Abrahamyan, H. A. Minasyan, K. V. Nerkararyan

Abstract:

Optical radiation emitted from a metal-coated fiber-tip apex at a liquid-air interface was measured. The intensity of the output radiation depended strongly on the position of the tip relative to the liquid-air interface and varied with surface fluctuations. This phenomenon permits in-situ, real-time investigation of nanometric vibrations of the liquid surface and provides a basis for the development of ultrasensitive vibration-detecting sensors of various kinds. The described method can be used for the detection of weak seismic vibrations.

Keywords: fiber-tip, liquid-air interface, nano vibration, opto-mechanical sensor

Procedia PDF Downloads 484
4636 Social Economic Factors Associated with the Nutritional Status of Children in Western Uganda

Authors: Baguma Daniel Kajura

Abstract:

The study explores socio-economic, health-related, and individual factors that influence the breastfeeding habits of mothers and their effect on the nutritional status of their infants in the Rwenzori region of Western Uganda. A cross-sectional research design was adopted, using self-administered questionnaires, interview guides, and focus group discussion guides to assess the extent to which socio-demographic factors associated with breastfeeding practices influence child malnutrition. Using this design, data were collected from 276 of the selected 318 mother-infant pairs over a period of ten days. The sample size was estimated using the Kish Leslie formula for cross-sectional studies, N = Zα² P(1 − P) / δ², where N is the estimated number of mother-infant pairs; P is the assumed true population prevalence of malnutrition among mother-infant pairs, P = 29.3%; 1 − P is the probability of a mother-infant pair not having malnutrition, 1 − P = 70.7%; Zα is the standard normal deviate at the 95% confidence interval, corresponding to 1.96; and δ is the absolute error between the estimated and true population prevalence of malnutrition, set at 5%. The calculated sample size is N = 1.96² × (0.293 × 0.707) / 0.05² ≈ 318 mother-infant pairs. Demographic and socio-economic data for all mothers were entered into Microsoft Excel and then exported to STATA 14 (StataCorp, 2015). Anthropometric measurements were taken for all children by the researcher and trained assistants, who physically weighed the children. Immunization cards were used to determine each child's age. Bivariate logistic regression analysis was used to assess the relationship between socio-demographic factors associated with breastfeeding practices and child malnutrition.
Multivariable regression analysis was used to determine whether there are true relationships between the socio-demographic factors associated with breastfeeding practices, as independent variables, and child stunting and underweight, as dependent variables. Descriptive statistics on the background characteristics of the mothers were generated and presented in frequency distribution tables. Frequencies and means were computed and presented in tables; we then determined the distribution of stunting and underweight among infants by socio-economic and demographic factors. Findings reveal that children of mothers who used milk substitutes besides breastfeeding are over two times more likely to be stunted than those whose mothers exclusively breastfed them. Feeding children milk substitutes instead of breastmilk predisposes them to both stunting and underweight. Children of mothers between 18 and 34 years of age are less likely to be underweight, as are those breastfed over ten times a day. The study further reveals that 55% of the children were underweight and 49% were stunted. Of the underweight children, equal numbers were mildly and moderately underweight (58/151, 38% each), and 23% (35/151) were severely underweight. Empowering community outreach programmes by increasing knowledge of, and access to, services for the integrated management of child malnutrition is crucial to curbing child malnutrition in rural areas.
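The Kish Leslie sample-size calculation used in the study can be reproduced directly from the quoted parameters:

```python
def kish_leslie_n(p, z=1.96, delta=0.05):
    """Cross-sectional sample size: N = z^2 * p * (1 - p) / delta^2."""
    return z * z * p * (1 - p) / (delta * delta)

# P = 29.3% assumed prevalence, 95% confidence (z = 1.96), 5% absolute error
n = kish_leslie_n(p=0.293)
print(round(n))  # 318 mother-infant pairs, as reported in the study
```

Note that the formula is most demanding at p = 0.5, where the same precision would require about 384 pairs; the study's lower assumed prevalence reduces the required sample.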

Keywords: infant and young child feeding, breastfeeding, child malnutrition, maternal health

Procedia PDF Downloads 24
4635 Productivity and Household Welfare Impact of Technology Adoption: A Microeconometric Analysis

Authors: Tigist Mekonnen Melesse

Abstract:

Since rural households are basically entitled to food through their own production, improving productivity may enhance the welfare of the rural population through higher food availability at the household level and lower prices of agricultural products. Increasing agricultural productivity through the use of improved technology is one of the desired outcomes of sensible food security and agricultural policy. The ultimate objective of this study was to evaluate the potential impact of improved agricultural technology adoption on smallholders' crop productivity and welfare. The study was conducted in Ethiopia, covering 1500 rural households drawn from four regions and 15 rural villages, based on data collected by the Ethiopian Rural Household Survey. An endogenous treatment effect model is employed in order to account for the selection bias in the adoption decision that is expected from households' self-selection into technology adoption. The treatment indicator, technology adoption, is a binary variable indicating whether or not the household used improved seeds and chemical fertilizer. The outcome variables were cereal crop productivity, measured as the real value of production, and household welfare, measured as real per capita consumption expenditure. Results of the analysis indicate a positive and significant effect of improved technology use on rural households' crop productivity and welfare in Ethiopia. Adoption of improved seeds or chemical fertilizer alone increases crop productivity by 7.38 and 6.32 percent per year, respectively. Adoption of these technologies is also found to improve household welfare by 1.17 and 0.25 percent per month, respectively. The combined effect of both technologies when adopted jointly is a 5.82 percent increase in crop productivity and a 0.42 percent improvement in welfare.
Besides, the educational level of the household head, farm size, labor use, participation in extension programs, expenditure on inputs, and the number of oxen positively affect crop productivity and household welfare, while a large household size negatively affects household welfare. In our estimation, the average treatment effect on the treated (ATET) is the same as the average treatment effect (ATE). This implies that the average predicted outcome for the treatment group is similar to the average predicted outcome for the whole population.
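The equality of ATE and ATET noted above holds whenever the treatment effect does not vary with adoption status. A small potential-outcomes simulation makes this concrete (all numbers are illustrative; the +0.06 gain only loosely mirrors the ~6-7% productivity effect reported):

```python
import random

random.seed(0)

# Simulated potential outcomes (log crop value) without adoption (y0) and with it (y1)
households = []
for _ in range(10_000):
    y0 = random.gauss(5.0, 0.5)
    y1 = y0 + 0.06                      # homogeneous treatment effect
    treated = random.random() < 0.4     # adoption decision
    households.append((y0, y1, treated))

# ATE averages the effect over everyone; ATET only over adopters
ate = sum(y1 - y0 for y0, y1, _ in households) / len(households)
atet = sum(y1 - y0 for y0, y1, t in households if t) / sum(t for *_, t in households)
print(round(ate, 3), round(atet, 3))  # 0.06 0.06 — equal under a homogeneous effect
```

If the effect instead varied with adoption (e.g. adopters gain more), ATET would exceed ATE, so the paper's finding of equal estimates is consistent with a homogeneous effect across households.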

Keywords: endogenous treatment effect, technologies, productivity, welfare, Ethiopia

Procedia PDF Downloads 656
4634 Clinical Feature Analysis and Prediction on Recurrence in Cervical Cancer

Authors: Ravinder Bahl, Jamini Sharma

Abstract:

The paper demonstrates an analysis of cervical cancer based on a probabilistic model. It involves a technique for classification and prediction that recognizes the typical and diagnostically most important test features relating to cervical cancer. The main contribution of the research is predicting the probability of recurrence in no-recurrence (first-time detection) cases. A combination of conventional statistical and machine learning tools is applied in the analysis. An experimental study with real data demonstrates the feasibility and potential of the proposed approach.
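One simple probabilistic classifier of the kind the abstract alludes to is naive Bayes over binary test features. The sketch below is a generic illustration under invented feature indicators and counts, not the paper's model or clinical data:

```python
import math

# Each record: (binary diagnostic-test indicators, recurrence label) — toy data
train = [
    ((1, 1, 0), 1), ((1, 0, 1), 1), ((1, 1, 1), 1),
    ((0, 0, 0), 0), ((0, 1, 0), 0), ((0, 0, 1), 0), ((0, 0, 0), 0),
]

def fit(data, n_features, alpha=1.0):
    """Class priors and per-feature Bernoulli parameters with Laplace smoothing."""
    model = {}
    for c in (0, 1):
        rows = [x for x, y in data if y == c]
        prior = len(rows) / len(data)
        theta = [(sum(r[j] for r in rows) + alpha) / (len(rows) + 2 * alpha)
                 for j in range(n_features)]
        model[c] = (prior, theta)
    return model

def predict_proba(model, x):
    """P(recurrence | x) via Bayes' rule, computed in log space for stability."""
    logp = {}
    for c, (prior, theta) in model.items():
        logp[c] = math.log(prior) + sum(
            math.log(t if xi else 1 - t) for xi, t in zip(x, theta))
    z = max(logp.values())
    total = sum(math.exp(v - z) for v in logp.values())
    return math.exp(logp[1] - z) / total

model = fit(train, n_features=3)
print(round(predict_proba(model, (1, 1, 0)), 2))  # high recurrence probability
```

A model of this form outputs a recurrence probability rather than a hard label, which is what makes risk-ranking of first-time detection cases possible.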

Keywords: cervical cancer, recurrence, no recurrence, probabilistic, classification, prediction, machine learning

Procedia PDF Downloads 360
4633 STC Parameters versus Real Time Measured Parameters to Determine Cost Effectiveness of PV Panels

Authors: V. E. Selaule, R. M. Schoeman, H. C. Z. Pienaar

Abstract:

Research has shown that solar energy is the renewable energy resource with the most potential when compared to other renewable energy resources in South Africa. There are many makes of photovoltaic (PV) panels on the market, and it is difficult to assess which to use. PV panel manufacturers use Standard Test Conditions (STC) to rate their PV panels. STC conditions differ from the actual operating environmental conditions where the PV panels are used. This paper describes a practical method to determine the most cost-effective available PV panel. The method shows that PV panel manufacturers' STC ratings cannot be used to select a cost-effective PV panel.
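The comparison the abstract describes amounts to ranking panels by cost per watt actually delivered on site rather than per STC-rated watt. A sketch with hypothetical prices and measurements (not the paper's data) shows how the two rankings can disagree:

```python
# (name, price_ZAR, STC_rated_W, measured_site_W) — all values hypothetical
panels = [
    ("Panel A", 2500.0, 300, 252),
    ("Panel B", 2300.0, 300, 231),
]

for name, price, stc_w, real_w in panels:
    print(f"{name}: {price / stc_w:.2f} ZAR/W (STC), "
          f"{price / real_w:.2f} ZAR/W (measured)")

# Ranking by STC rating favours the cheaper panel; the measured
# site output can reverse the ordering.
best_by_stc = min(panels, key=lambda p: p[1] / p[2])
best = min(panels, key=lambda p: p[1] / p[3])
print("Cheapest per STC watt:", best_by_stc[0])          # Panel B
print("Most cost-effective by measured output:", best[0])  # Panel A
```

In this toy case, the panel with the lower sticker price per rated watt delivers fewer real watts, illustrating why STC ratings alone cannot identify the cost-effective choice.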

Keywords: PV orientation, PV panel, PV STC, solar energy

Procedia PDF Downloads 474
4632 AIR SAFE: An Internet of Things System for Air Quality Management Leveraging Artificial Intelligence Algorithms

Authors: Mariangela Viviani, Daniele Germano, Simone Colace, Agostino Forestiero, Giuseppe Papuzzo, Sara Laurita

Abstract:

Nowadays, people spend most of their time in closed environments, in offices, or at home. Secure and highly livable environmental conditions are therefore needed to reduce the probability of airborne viruses spreading. Also, to lower the human impact on the planet, it is important to reduce energy consumption. Heating, Ventilation, and Air Conditioning (HVAC) systems account for the major part of energy consumption in buildings [1]. Devising systems to control and regulate the airflow is therefore essential for energy efficiency. Moreover, an optimal setting for thermal comfort and air quality is essential for people’s well-being, at home or in offices, and increases productivity. Thanks to the features of Artificial Intelligence (AI) tools and techniques, it is possible to design innovative systems with: (i) improved monitoring and prediction accuracy; (ii) enhanced decision-making and mitigation strategies; (iii) real-time air quality information; (iv) increased efficiency in data analysis and processing; (v) advanced early warning systems for air pollution events; (vi) an automated and cost-effective monitoring network; and (vii) a better understanding of air quality patterns and trends. We propose AIR SAFE, an IoT-based infrastructure designed to optimize air quality and thermal comfort in indoor environments by leveraging AI tools. AIR SAFE employs a network of smart sensors collecting indoor and outdoor data that are analyzed in order to take any corrective measures needed to ensure the occupants’ wellness. The data are analyzed through AI algorithms able to predict future levels of temperature, relative humidity, and CO₂ concentration [2]. Based on these predictions, AIR SAFE takes actions, such as opening or closing a window or switching the air conditioner, to guarantee a high level of thermal comfort and air quality in the environment.
In this contribution, we present the results from the AI algorithm we have implemented on the first set of data collected in a real environment. The results were compared with other models from the literature to validate our approach.
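The predict-then-act loop AIR SAFE performs can be sketched as follows. The threshold, class names, and the naive linear extrapolation standing in for the AI forecasting model are illustrative assumptions, not the deployed algorithm:

```python
from collections import deque

CO2_LIMIT_PPM = 1000  # illustrative indoor comfort threshold

class AirQualityController:
    """Predict the next CO2 reading and act before the limit is crossed."""

    def __init__(self, window=3):
        self.history = deque(maxlen=window)

    def predict_next(self):
        # Naive linear extrapolation from the last two readings
        # (stand-in for the AI forecasting model of [2]).
        if len(self.history) < 2:
            return self.history[-1] if self.history else 0.0
        return self.history[-1] + (self.history[-1] - self.history[-2])

    def step(self, co2_ppm):
        self.history.append(co2_ppm)
        return "open_window" if self.predict_next() > CO2_LIMIT_PPM else "idle"

ctrl = AirQualityController()
for reading in (800, 880, 960):
    action = ctrl.step(reading)
print(action)  # "open_window": the 1040 ppm forecast exceeds the limit
```

Acting on the forecast rather than the current reading is what lets the system ventilate before occupants experience poor air quality, while keeping windows closed (and HVAC use low) whenever the predicted trend stays within limits.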

Keywords: air quality, internet of things, artificial intelligence, smart home

Procedia PDF Downloads 94