Search results for: applied art
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8212

7072 Parameter Estimation in Dynamical Systems Based on Latent Variables

Authors: Arcady Ponosov

Abstract:

A novel mathematical approach is suggested that facilitates a compressed representation and efficient validation of parameter-rich ordinary differential equation models describing the dynamics of complex, especially biology-related, systems. The approach is based on identification of the system's latent variables. In particular, an efficient parameter estimation method for the compressed non-linear dynamical systems is developed. The method is applied to the so-called 'power-law systems', non-linear differential equations typically used in Biochemical Systems Theory.
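As an illustration of the model class referred to above (not the authors' compression or estimation method), the following minimal Python sketch simulates a two-variable power-law (S-system / generalized mass action) model; all rate constants and kinetic orders are assumed values chosen only for the example.

```python
# Minimal sketch of a power-law (S-system) model of the kind used in
# Biochemical Systems Theory; parameters below are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

alpha = np.array([2.0, 1.5])          # production rate constants (assumed)
beta = np.array([1.0, 0.8])           # degradation rate constants (assumed)
G = np.array([[0.0, -0.5],            # kinetic orders of the production terms
              [0.6, 0.0]])
H = np.array([[0.7, 0.0],             # kinetic orders of the degradation terms
              [0.0, 0.9]])

def s_system(t, x):
    # dX_i/dt = alpha_i * prod_j X_j^G_ij - beta_i * prod_j X_j^H_ij
    prod_g = np.prod(x ** G, axis=1)
    prod_h = np.prod(x ** H, axis=1)
    return alpha * prod_g - beta * prod_h

sol = solve_ivp(s_system, (0.0, 20.0), [0.5, 1.2])
print(sol.y[:, -1])                   # concentrations near steady state
```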

Keywords: generalized law of mass action, metamodels, principal components, synergetic systems

Procedia PDF Downloads 340
7071 The Effectiveness of a Courseware in 7th Grade Chemistry Lesson

Authors: Oguz Ak

Abstract:

In this study, a courseware for the learning unit 'Properties of Matter' in a chemistry course is developed. The courseware is applied to fifteen 7th grade (about age 14) students in real settings. As a result of the study, it is found that the students' grades in the learning unit increased significantly when they studied the courseware themselves. In addition, the score improvements of the students who found the courseware usable were not significantly higher than the score improvements of the students who did not find it usable.

Keywords: computer based instruction, effect of courseware and usability of courseware, 7th grade

Procedia PDF Downloads 447
7070 Optical Imaging Based Detection of Solder Paste in Printed Circuit Board Jet-Printing Inspection

Authors: D. Heinemann, S. Schramm, S. Knabner, D. Baumgarten

Abstract:

Purpose: Applying solder paste to printed circuit boards (PCB) with stencils has been the method of choice over the past years. A new method uses a jet printer to deposit tiny droplets of solder paste through an ejector mechanism onto the board. This allows for more flexible PCB layouts with smaller components. Due to the viscosity of the solder paste, air blisters can be trapped in the cartridge. This can lead to missing solder joints or deviations in the applied solder volume. Therefore, a built-in and real-time inspection of the printing process is needed to minimize uncertainties and increase the efficiency of the process by immediate correction. The objective of the current study is the design of an optimal imaging system and the development of an automatic algorithm for the detection of applied solder joints from the captured optical images. Methods: In a first approach, a camera module connected to a microcomputer and LED strips are employed to capture images of the printed circuit board under four different illuminations (white, red, green and blue). Subsequently, an improved system including a ring light, an objective lens, and a monochromatic camera was set up to acquire higher quality images. The obtained images can be divided into three main components: the PCB itself (i.e., the background), the reflections induced by unsoldered positions or screw holes, and the solder joints. Non-uniform illumination is corrected by estimating the background using a morphological opening and subtracting it from the input image. Image sharpening is applied in order to prevent error pixels in the subsequent segmentation. The intensity thresholds which divide the main components are obtained from the multimodal histogram using three probability density functions. Determining the intersections delivers proper thresholds for the segmentation. Remaining edge gradients produce small error areas, which are removed by another morphological opening. For quantitative analysis of the segmentation results, the Dice coefficient is used. Results: The obtained PCB images show a significant gradient in all RGB channels, resulting from ambient light. Using different lightings and color channels, 12 images of a single PCB are available. A visual inspection and the investigation of 27 specific points show the best differentiation between those points using red lighting and the green color channel. Estimating two thresholds from the multimodal histogram of the corrected images and using them for segmentation precisely extracts the solder joints. The comparison of the results to manually segmented images yields high sensitivity and specificity values. Analyzing the overall result delivers a Dice coefficient of 0.89, which varies for single object segmentations between 0.96 for well-segmented solder joints and 0.25 for single negative outliers. Conclusion: Our results demonstrate that the presented optical imaging system and the developed algorithm can robustly detect solder joints on printed circuit boards. Future work will comprise a modified lighting system which allows for more precise segmentation results using structure analysis.
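The illumination-correction and thresholding steps described above can be sketched as follows. This is a minimal illustration rather than the authors' implementation: structuring-element sizes are assumptions, and multi-Otsu thresholding stands in for the fitted probability density functions used in the study.

```python
# Sketch of background correction, thresholding and Dice evaluation for a
# grayscale PCB image given as a NumPy array; parameter values are assumed.
import numpy as np
from skimage import img_as_float
from skimage.filters import threshold_multiotsu, unsharp_mask
from skimage.morphology import binary_opening, disk, opening

def segment_solder(image):
    img = img_as_float(image)
    # 1) correct non-uniform illumination: estimate the background with a
    #    large morphological opening and subtract it from the input image
    background = opening(img, disk(25))
    corrected = np.clip(img - background, 0.0, 1.0)
    # 2) sharpen to reduce error pixels in the subsequent segmentation
    sharpened = unsharp_mask(corrected, radius=2, amount=1.0)
    # 3) two thresholds separate background, reflections and solder joints
    thresholds = threshold_multiotsu(sharpened, classes=3)
    solder_mask = sharpened > thresholds[1]
    # 4) remove the small error areas left by edge gradients
    return binary_opening(solder_mask, disk(3))

def dice(pred, truth):
    # Dice coefficient used for comparison against manual segmentations
    pred, truth = pred.astype(bool), truth.astype(bool)
    return 2.0 * np.logical_and(pred, truth).sum() / (pred.sum() + truth.sum())
```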

Keywords: printed circuit board jet-printing, inspection, segmentation, solder paste detection

Procedia PDF Downloads 321
7069 Development and Effects of Transtheoretical Model Exercise Program for Elderly Women with Chronic Back Pain

Authors: Hyun-Ju Oh, Soon-Rim Suh, Mihan Kim

Abstract:

The steady and rapid increase of the older population is a global phenomenon. Chronic diseases and disabilities increase with aging. In general, exercise has been known to be most effective in preventing and managing chronic back pain. However, it is hard for older women to initiate and maintain exercise. The transtheoretical model (TTM) is one of the theories that explain behavioral changes such as exercise. The application of a program that considers the stage of behavior change is effective in helping elderly women start and maintain exercise. The purpose of this study was to develop a TTM-based exercise program and to examine its effect for elderly women with chronic back pain. For the program evaluation, a non-equivalent control group pretest-posttest design was applied. The independent variable of this study is the exercise intervention program. The contents of the program were constructed considering the characteristics of elderly women with chronic low back pain, focusing on the processes of change and the stages of change reported in previous studies. The developed exercise program was applied to elderly women with chronic low back pain in the planning and preparation stages. The subjects were 50 women over 65 years of age with chronic back pain who did not practice regular exercise. The experimental group (n=25) received the 8-week TTM-based exercise program. The control group received a book on low back pain management. Data were collected at three time points: before the exercise intervention, right after the intervention, and 4 weeks after the intervention. The dependent variables were the processes of change, decisional balance, exercise self-efficacy, back pain, depression and muscle strength. The results of this study were as follows. Processes of change (<.001), pros of decisional balance (<.001), exercise self-efficacy (<.001), back pain (<.001), depression (<.001) and muscle strength (<.001) showed significantly greater improvement in the experimental group than in the control group right after the program and 4 weeks after the program. These results show that applying the TTM-based exercise program increases the use of the processes of change, increases exercise self-efficacy, advances the stage of exercise behavior change and strengthens muscular strength while lowering the degree of pain and depression. The significance of the study was to confirm the effect of continuous exercise, achieved by maintaining regular exercise habits, by applying an exercise program based on the transtheoretical model to elderly women with chronic low back pain who intend to exercise.

Keywords: chronic back pain, elderly, exercise, women

Procedia PDF Downloads 243
7068 Literature Review and Evaluation of the Internal Marketing Theory

Authors: Hsiao Hsun Yuan

Abstract:

Internal marketing was proposed in the 1970s. The concept has continually changed over the past forty years. This study discussed the following themes: the definition and implications of internal marketing, the progress of its development, and the evolution of its theoretical model. Moreover, the study systematically organized the strategies of internal marketing theory adopted by enterprises and how they were put into practice. It also compared the empirical studies focusing on how the existing theories influenced the important variables of internal marketing. The results of this study are expected to serve as references for future exploration of the boundaries of internal marketing and for studies of how internal marketing is applied to different types of enterprises.

Keywords: corporate responsibility, employee organizational performance, internal marketing, internal customer

Procedia PDF Downloads 337
7067 Optical Whitening of Textiles: Teaching and Learning Materials

Authors: C. W. Kan

Abstract:

This study examines the results of the optical whitening process applied to different textiles such as cotton, wool and polyester. The optical whitening agents used are commercially available products, and they were applied to the textiles following the manufacturers' suggested methods. The aim of this study is to illustrate the proper application methods of optical whitening agents to different textiles and hence to provide a guidance note for students learning this topic. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for the financial support of this work.

Keywords: learning materials, optical whitening agent, wool, cotton, polyester

Procedia PDF Downloads 407
7066 The Routine Use of a Negative Pressure Incision Management System in Vascular Surgery: A Case Series

Authors: Hansraj Bookun, Angela Tan, Rachel Xuan, Linheng Zhao, Kejia Wang, Animesh Singla, David Kim, Christopher Loupos

Abstract:

Introduction: Incisional wound complications in vascular surgery patients represent a significant clinical and economic burden of morbidity and mortality. The objective of this study was to trial the feasibility of applying the Prevena negative pressure incision management system as a routine dressing in patients who had undergone arterial surgery. Conventionally, Prevena has been applied to groin incisions, but this study features applications on multiple wound sites such as the thigh or major amputation stumps. Method: This was a cross-sectional, observational, single-centre case series of 12 patients who had undergone major vascular surgery. Their wounds were managed with the Prevena system applied either intra-operatively or on the first post-operative day. Demographic and operative details were collated, as well as length of stay and complication rates. Results: There were 9 males (75%) with a mean age of 66 years, and the comorbid burden was as follows: ischaemic heart disease (92%), diabetes (42%), hypertension (100%), stage 4 or greater kidney impairment (17%) and current or ex-smoking (83%). The main indications were acute ischaemia (33%), claudication (25%), and gangrene (17%). There were single instances of an occluded popliteal artery aneurysm, diabetic foot infection, and rest pain. Half of the patients (50%) had hybrid operations with iliofemoral endarterectomies, patch arterioplasties, and further peripheral endovascular treatment. There were 4 complex arterial bypass operations and 2 major amputations. The mean length of stay was 17 ± 10 days, with a range of 4 to 35 days. A single complication, in the form of a lymphocoele, was encountered in the context of an iliofemoral endarterectomy and patch arterioplasty. This was managed conservatively. There were no deaths. Discussion: The Prevena wound management system shows that, in conjunction with safe vascular surgery, absolute wound complication rates remain low, and it remains a valuable adjunct in the treatment of vascular patients.

Keywords: wound care, negative pressure, vascular surgery, closed incision

Procedia PDF Downloads 115
7065 Design of a Fuzzy Luenberger Observer for Fault Nonlinear System

Authors: Mounir Bekaik, Messaoud Ramdani

Abstract:

We present in this work a new stabilization technique for nonlinear systems with faults. The approach we adopt focuses on a fuzzy Luenberger observer. The T-S approximation of the nonlinear observer is based on the fuzzy C-means clustering algorithm to find local linear subsystems. The MOESP identification approach was applied to design an empirical model describing the subsystems' state variables. The gain of the observer is obtained by minimizing the estimation error through a Lyapunov-Krasovskii functional and an LMI approach. We consider a three-tank hydraulic system as an illustrative example.
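For background, a minimal sketch of a classical Luenberger observer for a single local linear subsystem is given below. The paper blends several such local observers with T-S fuzzy weights and obtains the gain via LMIs; here, for illustration only, the gain comes from simple pole placement and the system matrices are assumed.

```python
# Sketch of a Luenberger observer for one assumed local linear subsystem;
# the observer gain L is obtained by pole placement (duality with state
# feedback), not by the LMI design used in the paper.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import place_poles

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # assumed subsystem matrices
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

# Choose L so that (A - L C) has faster poles than the plant itself
L = place_poles(A.T, C.T, [-5.0, -6.0]).gain_matrix.T

def coupled(t, z):
    x, xh = z[:2], z[2:]                   # true state and estimate
    u = np.array([np.sin(t)])              # arbitrary test input
    y = C @ x                              # measured output
    dx = A @ x + B @ u
    dxh = A @ xh + B @ u + L @ (y - C @ xh)
    return np.concatenate([dx, dxh])

sol = solve_ivp(coupled, (0.0, 10.0), [1.0, 0.0, 0.0, 0.0], max_step=0.01)
print(np.abs(sol.y[:2, -1] - sol.y[2:, -1]))   # estimation error -> ~0
```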

Keywords: nonlinear system, fuzzy, faults, TS, Lyapunov-Krasovskii, observer

Procedia PDF Downloads 311
7064 Comparing Test Equating by Item Response Theory and Raw Score Methods with Small Sample Sizes on a Study of the ARTé: Mecenas Learning Game

Authors: Steven W. Carruthers

Abstract:

The purpose of the present research is to equate two test forms as part of a study to evaluate the educational effectiveness of the ARTé: Mecenas art history learning game. The researcher applied Item Response Theory (IRT) procedures to calculate item, test, and mean-sigma equating parameters. With the sample size n=134, test parameters indicated “good” model fit but low Test Information Functions and more acute than expected equating parameters. Therefore, the researcher applied equipercentile equating and linear equating to raw scores and compared the equated form parameters and effect sizes from each method. Item scaling in IRT enables the researcher to select a subset of well-discriminating items. The mean-sigma step produces a mean-slope adjustment from the anchor items, which was used to scale the score on the new form (Form R) to the reference form (Form Q) scale. In equipercentile equating, scores are adjusted to align the proportion of scores in each quintile segment. Linear equating produces a mean-slope adjustment, which was applied to all core items on the new form. The study followed a quasi-experimental design with purposeful sampling of students enrolled in a college-level art history course (n=134) and a counterbalancing design to distribute both forms on the pre- and posttests. The Experimental Group (n=82) was asked to play ARTé: Mecenas online and complete Level 4 of the game within a two-week period; 37 participants completed Level 4. Over the same period, the Control Group (n=52) did not play the game. The researcher examined between-group differences from posttest scores on test Form Q and Form R by full-factorial two-way ANOVA. The raw score analysis indicated a 1.29% direct effect of form, which was statistically non-significant but may be practically significant. The researcher repeated the between-group differences analysis with all three equating methods. For the IRT mean-sigma adjusted scores, form had a direct effect of 8.39%. Mean-sigma equating with a small sample may have resulted in inaccurate equating parameters. Equipercentile equating aligned test means and standard deviations, but the resultant skewness and kurtosis worsened compared to the raw score parameters. Form had a 3.18% direct effect. Linear equating produced the lowest form effect, approaching 0%. Using linearly equated scores, the researcher conducted an ANCOVA to examine the effect size in terms of prior knowledge. The between-group effect size for the Control Group versus the Experimental Group participants who completed the game was 14.39%, with a 4.77% effect size attributed to pretest score. Playing and completing the game increased art history knowledge, and individuals with low prior knowledge tended to gain more from pre- to posttest. Ultimately, researchers should approach test equating based on their theoretical stance on Classical Test Theory and IRT and the respective assumptions. Regardless of the approach or method, test equating requires a representative sample of sufficient size. With small sample sizes, the application of a range of equating approaches can expose item and test features for review, inform interpretation, and identify paths for improving instruments for future study.
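The linear equating step described above can be sketched in a few lines: scores on the new form (Form R) are placed on the reference form (Form Q) scale by matching means and standard deviations. The same mean/sigma idea underlies the IRT mean-sigma step, except that there the anchor-item parameters, rather than raw scores, supply the means and standard deviations. The score arrays below are illustrative placeholders, not study data.

```python
# Minimal sketch of linear (mean/sigma) equating of raw scores.
import numpy as np

def linear_equate(x_new, scores_new, scores_ref):
    """Map a raw score from the new form onto the reference-form scale."""
    mu_n, sd_n = np.mean(scores_new), np.std(scores_new, ddof=1)
    mu_r, sd_r = np.mean(scores_ref), np.std(scores_ref, ddof=1)
    slope = sd_r / sd_n                       # sigma adjustment
    return slope * (x_new - mu_n) + mu_r      # plus mean adjustment

form_r = np.array([12, 15, 18, 20, 22, 25, 27, 30])   # assumed Form R scores
form_q = np.array([14, 16, 19, 22, 24, 26, 29, 31])   # assumed Form Q scores
print(linear_equate(21, form_r, form_q))
```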

Keywords: effectiveness, equipercentile equating, IRT, learning games, linear equating, mean-sigma equating

Procedia PDF Downloads 180
7063 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on encoding schemes (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) carry richer information but lack geometric invariance. In scene classification, there are scattered objects of different sizes, categories, layouts, numbers and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, which is then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of the first and second order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is followed by a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories. Experimental results show that the scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks.
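A minimal sketch of the scale-wise normalization and average-pooling step described above is given below, assuming that a Fisher vector has already been computed for each scale; the dimensionality and the number of scales are illustrative assumptions, and the Fisher-kernel aggregation itself is not shown.

```python
# Sketch of merging per-scale Fisher vectors: L2-normalize each scale,
# then average-pool into a single image-level representation.
import numpy as np

def merge_scales(fisher_vectors_per_scale):
    normed = [fv / (np.linalg.norm(fv) + 1e-12)      # scale-wise L2 norm
              for fv in fisher_vectors_per_scale]
    return np.mean(normed, axis=0)                   # average pooling

rng = np.random.default_rng(0)
per_scale = [rng.standard_normal(4096) for _ in range(2)]   # e.g. two scales
representation = merge_scales(per_scale)
print(representation.shape)                                  # (4096,)
```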

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 315
7062 Sequential Covering Algorithm for Nondifferentiable Global Optimization Problem and Applications

Authors: Mohamed Rahal, Djaouida Guetta

Abstract:

In this paper, the one-dimensional unconstrained global optimization problem for continuous functions satisfying a Hölder condition is considered. We extend the sequential covering algorithm SCA for Lipschitz functions to a large class of Hölder functions. The convergence of the method is studied, and the algorithm can be applied to systems of nonlinear equations. Finally, some numerical examples are presented to illustrate the efficiency of the present approach.
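The bounding idea behind covering methods for Hölder functions can be sketched as follows: on an interval with center c and half-width r, the condition |f(x) - f(y)| <= H|x - y|^alpha gives the lower bound f(c) - H r^alpha, so intervals whose bound cannot beat the current best value need not be refined. The sketch below is only an illustration of this principle, not the paper's SCA subdivision rules; the constant H, the exponent alpha and the test function are assumptions.

```python
# Covering-type global minimization sketch for a Hoelder-continuous function.
import heapq
import numpy as np

def holder_minimize(f, a, b, H, alpha, tol=1e-4, max_iter=10000):
    c = 0.5 * (a + b)
    best_x, best_f = c, f(c)
    # heap entries: (lower bound on the interval, left end, right end)
    heap = [(best_f - H * (0.5 * (b - a)) ** alpha, a, b)]
    for _ in range(max_iter):
        bound, left, right = heapq.heappop(heap)
        if best_f - bound < tol:          # the cover is fine enough
            break
        mid = 0.5 * (left + right)
        for lo, hi in ((left, mid), (mid, right)):
            c = 0.5 * (lo + hi)
            fc = f(c)
            if fc < best_f:
                best_x, best_f = c, fc
            # Hoelder condition gives a valid lower bound on [lo, hi]
            heapq.heappush(heap, (fc - H * (0.5 * (hi - lo)) ** alpha, lo, hi))
    return best_x, best_f

f = lambda x: np.sin(x) + 0.3 * np.sin(3.3 * x)   # assumed multimodal test
print(holder_minimize(f, 0.0, 10.0, H=2.0, alpha=1.0))
```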

Keywords: global optimization, Hölder functions, sequential covering method, systems of nonlinear equations

Procedia PDF Downloads 352
7061 Legal Means for Access to Information Management

Authors: Sameut Bouhaik Mostafa

Abstract:

The Access to Information Act is the Canadian law that gives a right of access to information held by government institutions. It declares that government information should be available to the public, that exceptions to this right of access should be limited and specific, and that decisions on the disclosure of government information should be reviewed independently of the government. By 1982, a dozen countries, including France, Denmark, Finland, Sweden, the Netherlands and the United States (1966), had enacted modern access-to-information legislation. The Canadian Access to Information Act came into force in 1983, under the government of Pierre Trudeau, allowing Canadians to retrieve information from government files, defining what information can be accessed, and imposing timetables for responses. It has been applied by the Information Commissioner of Canada.

Keywords: law, information, management, legal

Procedia PDF Downloads 397
7060 Applications of AFM in 4D to Optimize the Design of Genetic Nanoparticles

Authors: Hosam Abdelhady

Abstract:

Filming the behavior of individual DNA molecules in their environment as they interact with individual medicinal nano-polymers at the molecular scale has opened the door to understanding the effect of molecular shape, size, and incubation time with nanocarriers on optimizing the design of robust genetic nanomolecules able to resist enzymatic degradation, enter the cell, reach the nucleus and kill individual cancer cells in their environment. To this end, we will show how we applied 4D AFM as a guide to fine-tune the design of genetic nanoparticles and to film the effects of these nanoparticles on the nanomechanical and morphological profiles of individual cancer cells.

Keywords: AFM, dendrimers, nanoparticles, DNA, gene therapy, imaging

Procedia PDF Downloads 64
7059 Annealing of the Contact between Graphene and Metal: Electrical and Raman Study

Authors: A. Sakavičius, A. Lukša, V. Nargelienė, V. Bukauskas, G. Astromskas, A. Šetkus

Abstract:

We investigate the influence of annealing on the properties of a contact between graphene and metal (Au and Ni), using circular transmission line model (CTLM) contact geometry. Kelvin probe force microscopy (KPFM) and Raman spectroscopy are applied for characterization of the surface and interface properties. Annealing causes a decrease of the metal-graphene contact resistance for both Ni and Au.

Keywords: Au/Graphene contacts, graphene, Kelvin force probe microscopy, NiC/Graphene contacts, Ni/Graphene contacts, Raman spectroscopy

Procedia PDF Downloads 294
7058 Hamilton-Jacobi Treatment of Damped Motion

Authors: Khaled I. Nawafleh

Abstract:

In this work, we apply the Hamilton-Jacobi method to obtain solutions of Hamiltonian systems in classical mechanics with two particular structures: the first structure plays a central role in the theory of time-dependent Hamiltonians, whilst the second is used to treat classical Hamiltonians that include dissipation terms. It is proved that the generalization of problems from the calculus of variations to the nonstationary case can be obtained naturally in the Hamilton-Jacobi formalism. Then, another expression of the geometry of the Hamilton-Jacobi equation is derived for Hamiltonians with time-dependent and frictional terms. Both approaches are applied to several physical examples.
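As an illustration of the second structure mentioned above (a Hamiltonian including dissipation), the standard textbook Caldirola-Kanai form and its Hamilton-Jacobi equation are reproduced below; the friction coefficient lambda and the potential V(q) are generic symbols, not quantities taken from the paper.

```latex
% Caldirola-Kanai Lagrangian/Hamiltonian for linearly damped motion and the
% corresponding Hamilton-Jacobi equation (illustrative, textbook form).
\begin{align}
  L(q,\dot q,t) &= e^{\lambda t}\left[\tfrac{1}{2}\, m \dot q^{2} - V(q)\right], \\
  H(q,p,t)      &= e^{-\lambda t}\,\frac{p^{2}}{2m} + e^{\lambda t}\,V(q), \\
  \frac{\partial S}{\partial t}
    &+ \frac{e^{-\lambda t}}{2m}\left(\frac{\partial S}{\partial q}\right)^{2}
     + e^{\lambda t}\,V(q) = 0 .
\end{align}
```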

Keywords: Hamilton-Jacobi, time-dependent Lagrangians, dissipative systems, variational principle

Procedia PDF Downloads 154
7057 Using Life Cycle Assessment in Potable Water Treatment Plant: A Colombian Case Study

Authors: Oscar Orlando Ortiz Rodriguez, Raquel A. Villamizar-G, Alexander Araque

Abstract:

There is a total of 1027 municipal development plans in Colombia; 70% of municipalities had Potable Water Treatment Plants (PWTPs) in urban areas and 20% in rural areas. These PWTPs are typically supplied by surface waters (mainly rivers) and resort to gravity, pumping and/or mixed systems to get the water from the catchment point, where the first stage of the potable water process takes place. Subsequently, a series of conventional methods is applied, consisting of a more or less standardized sequence of physicochemical and, sometimes, biological treatment processes which vary depending on the quality of the water that enters the plant. These processes require energy and chemical supplies in order to guarantee an adequate product for human consumption. Therefore, in this paper, we applied the environmental methodology of Life Cycle Assessment (LCA) to evaluate the environmental loads of a potable water treatment plant (PWTP) located in northeastern Colombia, following the international guidelines of ISO 14040. The different stages of the potable water process, from the catchment point through pumping to the distribution network, were thoroughly assessed. The functional unit was defined as 1 m³ of water treated. The data were analyzed through the database Ecoinvent v.3.01, and modeled and processed in the software LCA-Data Manager. The results allowed determining that, in the plant, the largest impact was caused by Clarifloc (82%), followed by chlorine gas (13%) and power consumption (4%). In this context, the company involved in the sustainability of the potable water service should ideally reduce these environmental loads during the potable water process. One strategy could be to reduce the use of Clarifloc by applying coadjuvants or other coagulant agents. The preservation of the hydric source that supplies the treatment plant also constitutes an important factor, since its deterioration confers unfavorable characteristics on the water that is to be treated. In conclusion, treatment processes and techniques, bioclimatic conditions and culturally driven consumption behavior vary from region to region. Furthermore, changes in treatment processes and techniques are likely to affect the environment during all stages of a plant's operation cycle.

Keywords: climate change, environmental impact, life cycle assessment, treated water

Procedia PDF Downloads 211
7056 Epileptic Seizure Prediction by Exploiting Signal Transitions Phenomena

Authors: Mohammad Zavid Parvez, Manoranjan Paul

Abstract:

A seizure prediction method is proposed by extracting global features using phase correlation between adjacent epochs for detecting relative changes and local features using fluctuation/deviation within an epoch for determining fine changes of different EEG signals. A classifier and a regularization technique are applied for the reduction of false alarms and improvement of the overall prediction accuracy. The experiments show that the proposed method outperforms the state-of-the-art methods and provides high prediction accuracy (i.e., 97.70%) with low false alarm using EEG signals in different brain locations from a benchmark data set.
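The phase-correlation feature between adjacent epochs mentioned above can be sketched as follows; the epoch length, the synthetic test signals and the chosen shift are assumptions used only to make the example self-contained.

```python
# Sketch of phase correlation between two adjacent EEG epochs: the peak of
# the inverse transform of the normalized cross-power spectrum indicates
# how similar (and how shifted) consecutive epochs are.
import numpy as np

def phase_correlation(ref_epoch, cur_epoch):
    fr, fc = np.fft.rfft(ref_epoch), np.fft.rfft(cur_epoch)
    cross = np.conj(fr) * fc
    cross /= np.abs(cross) + 1e-12          # keep phase information only
    return np.fft.irfft(cross, n=len(ref_epoch))

rng = np.random.default_rng(0)
epoch1 = rng.standard_normal(512)                        # one epoch
epoch2 = np.roll(epoch1, 7) + 0.1 * rng.standard_normal(512)  # shifted + noise
r = phase_correlation(epoch1, epoch2)
print(np.argmax(r), r.max())   # location of the peak and its height
```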

Keywords: epilepsy, seizure, phase correlation, fluctuation, deviation

Procedia PDF Downloads 455
7055 Tool for Maxillary Sinus Quantification in Computed Tomography Exams

Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina

Abstract:

The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, heat or humidify inspired air, contribute to thermoregulation, impart resonance to the voice, among other roles. Thus, the real function of the MS is still uncertain. Furthermore, the MS anatomy is complex and varies from person to person. Many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and if possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intra-individual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors' institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM), based on features such as pixel value, spatial distribution, shape and others. The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. From the statistical analyses for the comparison between both methods, the linear regression showed a strong association and low dispersion between variables. The Bland-Altman analyses showed no significant differences between the analyzed methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases.
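A minimal sketch of the region-growing and agreement steps described above is given below (the study itself was implemented in Matlab®, so this Python version is only illustrative). One CT slice is assumed to be available as a 2-D array, the SVM-detected seed points are assumed to be given, and the tolerance value is a placeholder.

```python
# Sketch: grow a region from SVM-provided seed points, clean it with a
# morphological opening, and compare against a manual mask with Jaccard.
import numpy as np
from skimage.morphology import binary_opening, disk
from skimage.segmentation import flood

def grow_maxillary_sinus(ct_slice, seed_points, tolerance=80):
    mask = np.zeros(ct_slice.shape, dtype=bool)
    for seed in seed_points:                 # (row, col) seeds from the SVM
        mask |= flood(ct_slice, seed, tolerance=tolerance)
    return binary_opening(mask, disk(3))     # remove false-positive pixels

def jaccard(auto_mask, manual_mask):
    a, m = auto_mask.astype(bool), manual_mask.astype(bool)
    return np.logical_and(a, m).sum() / np.logical_or(a, m).sum()
```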

Keywords: maxillary sinus, support vector machine, region growing, volume quantification

Procedia PDF Downloads 492
7054 Towards Update a Road Map Solution: Use of Information Obtained by the Extraction of Road Network and Its Nodes from a Satellite Image

Authors: Z. Nougrara, J. Meunier

Abstract:

In this paper, we present a new approach for extracting roads, their road network and its nodes from satellite images representing regions in Algeria. Our approach is related to our previous research work. It is founded on information theory and mathematical morphology. We therefore have to define objects as sets of pixels and to study the shape of these objects and the relations that exist between them. The main interest of this study is to solve the problem of automatic mapping from satellite images. The goal is for the geographical representation derived from the images to be as close as possible to reality.

Keywords: nodes, road network, satellite image, updating a road map

Procedia PDF Downloads 404
7053 Calibration and Validation of the Aquacrop Model for Simulating Growth and Yield of Rain-fed Sesame (Sesamum indicum L.) Under Different Soil Fertility Levels in the Semi-arid Areas of Tigray

Authors: Abadi Berhane, Walelign Worku, Berhanu Abrha, Gebre Hadgu, Tigray

Abstract:

Sesame is an important oilseed crop in Ethiopia and the second most exported agricultural commodity next to coffee. However, soil fertility management is poor and a research-led farming system for the crop is lacking. The AquaCrop model was applied as a decision-support tool; it uses a semi-quantitative approach to simulate the yield of crops under different soil fertility levels. The objective of this experiment was to calibrate and validate the AquaCrop model for simulating the growth and yield of sesame under different nitrogen fertilizer levels and to test the performance of the model as a decision-support tool for improved sesame cultivation in the study area. The experiment was laid out as a randomized complete block design (RCBD) in a factorial arrangement in the 2016, 2017, and 2018 main cropping seasons. Four nitrogen fertilizer rates (0, 23, 46, and 69 kg/ha nitrogen) and three improved varieties (Setit-1, Setit-2, and Humera-1) were used. Growth, yield, and yield components of sesame were collected from each treatment. The coefficient of determination (R2), root mean square error (RMSE), normalized root mean square error (N-RMSE), model efficiency (E), and degree of agreement (D) were used to test the performance of the model. The results indicated that the AquaCrop model successfully simulated soil water content, with R2 varying from 0.92 to 0.98, RMSE from 6.5 to 13.9 mm, E from 0.78 to 0.94, and D from 0.95 to 0.99; the corresponding values for AB also varied from 0.92 to 0.98, 0.33 to 0.54 tons/ha, 0.74 to 0.93, and 0.9 to 0.98, respectively. The results on the canopy cover of sesame also showed that the model acceptably simulated canopy cover, with R2 varying from 0.95 to 0.99 and an RMSE of 5.3 to 8.6%. The AquaCrop model was appropriately calibrated to simulate soil water content, canopy cover, aboveground biomass, and sesame yield; the results indicated that the model adequately simulated the growth and yield of sesame under the different nitrogen fertilizer levels. The AquaCrop model might be an important tool for improved soil fertility management and yield enhancement strategies for sesame. Hence, the model might be applied as a decision-support tool in soil fertility management in sesame production.
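The agreement statistics used above to compare observed and simulated values can be sketched as follows; the two small arrays are placeholders, not study data, and the formulas are the commonly used definitions (Nash-Sutcliffe efficiency for E and Willmott's index for D).

```python
# Sketch of the calibration/validation statistics: R2, RMSE, N-RMSE, E, D.
import numpy as np

def agreement_stats(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    diff = sim - obs
    rmse = np.sqrt(np.mean(diff ** 2))
    return {
        "R2": np.corrcoef(obs, sim)[0, 1] ** 2,
        "RMSE": rmse,
        "N-RMSE": 100.0 * rmse / np.mean(obs),                    # percent
        "E": 1.0 - np.sum(diff ** 2)
                 / np.sum((obs - obs.mean()) ** 2),               # Nash-Sutcliffe
        "D": 1.0 - np.sum(diff ** 2)
                 / np.sum((np.abs(sim - obs.mean())
                           + np.abs(obs - obs.mean())) ** 2),     # Willmott
    }

print(agreement_stats(obs=[120, 135, 150, 160], sim=[118, 139, 146, 163]))
```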

Keywords: aquacrop model, sesame, normalized water productivity, nitrogen fertilizer

Procedia PDF Downloads 55
7052 Combined Surface Tension and Natural Convection of Nanofluids in a Square Open Cavity

Authors: Habibis Saleh, Ishak Hashim

Abstract:

Combined surface tension and natural convection heat transfer in an open cavity is studied numerically in this article. The cavity is filled with water-Cu nanofluids. The left wall is kept at low temperature, the right wall at high temperature, and the bottom and top walls are adiabatic. The top free surface is assumed to be flat and non-deformable. The finite difference method is applied to solve the dimensionless governing equations. It is found that the effect of adding the nanoparticles becomes insignificant at about Ma_bf = 250.

Keywords: natural convection, marangoni convection, nanofluids, square open cavity

Procedia PDF Downloads 534
7051 The Impact of Mergers and Acquisitions on Financial Deepening in the Nigerian Banking Sector

Authors: Onyinyechi Joy Kingdom

Abstract:

Mergers and Acquisitions (M&A) have been proposed as a mechanism through which problems associated with inefficiency or poor performance in financial institutions could be addressed. The aim of this study is to examine the proposition that the recapitalization of banks, which encouraged Mergers and Acquisitions in the Nigerian banking system, would strengthen the domestic banks, improve financial deepening and improve the confidence of depositors. Hence, this study examines the impact of the 2005 M&A in the Nigerian banking sector on financial deepening using a mixed method (quantitative and qualitative approach). The quantitative part of this study utilised an annual time series of a financial deepening indicator for the period 1997 to 2012. The qualitative aspect adopted semi-structured interviews to collate data from three merged banks and three stand-alone banks in order to explore, understand and complement the quantitative results. Furthermore, a framework thematic analysis was employed to analyse the themes developed using NVivo 11 software. Using the quantitative approach, findings from the equality of means test (EMT) suggest that M&A have a significant impact on financial deepening. However, this method is not robust enough, given its weak validity, as it does not control for other potential factors that may determine financial deepening. Thus, to control for other factors that may affect the level of financial deepening, a Multiple Regression Model (MRM) and Interrupted Time Series Analysis (ITSA) were applied. The coefficient for the M&A dummy turned negative and insignificant using the MRM. In addition, the estimated linear trend of the post-intervention period when ITSA was applied suggests that after the M&A, the level of financial deepening decreased annually; however, this was statistically insignificant. Similarly, using the qualitative approach, the results from the interviews supported the quantitative results from the ITSA and MRM. The results suggest that the interest rate should fall when the capital base is increased in order to improve financial deepening. Hence, this study contributes to the existing literature by highlighting the importance of other factors that may affect financial deepening and the economy when policies intended to enhance bank performance and the economy are made. In addition, this study will provide valuable policy instruments relevant to monetary authorities when formulating policies that will strengthen the Nigerian banking sector and the economy.
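The interrupted time series setup described above can be sketched as a segmented regression with an overall trend, a level-shift dummy for the 2005 intervention, and a post-intervention trend-change term; the annual financial-deepening series below is synthetic, used only to show the model structure.

```python
# Sketch of an interrupted time-series (segmented) regression around 2005.
import numpy as np
import statsmodels.api as sm

years = np.arange(1997, 2013)                  # 1997-2012 annual observations
y = np.array([12.0, 12.5, 13.1, 13.8, 14.2, 14.9, 15.4, 16.0,   # pre-2005
              16.3, 16.1, 16.0, 15.8, 15.9, 15.7, 15.6, 15.5])  # post-2005

time = np.arange(len(years))                   # overall linear trend
post = (years >= 2005).astype(int)             # level change (M&A dummy)
time_since = np.where(post == 1, time - time[post.argmax()], 0)  # trend change

X = sm.add_constant(np.column_stack([time, post, time_since]))
model = sm.OLS(y, X).fit()
print(model.params)   # constant, pre-trend, level shift, post-trend change
```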

Keywords: mergers and acquisitions, recapitalization, financial deepening, efficiency, financial crisis

Procedia PDF Downloads 379
7050 Single and Sequential Extraction for Potassium Fractionation and Nano-Clay Flocculation Structure

Authors: Chakkrit Poonpakdee, Jing-Hua Tzen, Ya-Zhen Huang, Yao-Tung Lin

Abstract:

Potassium (K) is a known macronutrient and essential element for plant growth. Single leaching and modified sequential extraction schemes have been developed to estimate the relative phase associations of soil samples. The sequential extraction process is a step in analyzing the partitioning of metals affected by environmental conditions, but it is not a tool for the estimation of K bioavailability. The traditional single leaching method, in contrast, has long been used to classify K speciation according to its availability to plants and is used to determine potash fertilizer recommendation rates. Clay minerals in soil are a factor controlling soil fertility. The change in the micro-structure of clay minerals under various environments (i.e. swelling or shrinking) is characterized using Transmission X-Ray Microscopy (TXM). The objectives of this study are to 1) compare the distribution of K speciation between the single leaching and sequential extraction processes and 2) determine the clay particle flocculation structure before/after suspension with K+ using TXM. Four tropical soil samples were selected: farming without K fertilizer (10 years), long-term applied K fertilizer (10 years; 168-240 kg K2O ha-1 year-1), red soil (450-500 kg K2O ha-1 year-1) and forest soil. The results showed that the amounts of K speciation obtained by the single leaching method were, in decreasing order, mineral K, HNO3 K, non-exchangeable K, NH4OAc K, exchangeable K and water-soluble K. The sequential extraction process indicated that most K speciations in soil were associated with the residual, organic matter, Fe or Mn oxide and exchangeable fractions, and the K fraction associated with carbonate was not detected in the tropical soil samples. The soils under long-term K fertilization and the red soil had higher exchangeable K than the soil farmed long-term without K fertilizer and the forest soil. The results indicated that one way to increase the available K (water-soluble K and exchangeable K) is to apply K fertilizer and organic fertilizer. The two-dimensional TXM images of clay particles suspended with K+ show that the clay minerals aggregate into closed-void cellular networks. The porous cellular structure of soil aggregates in 1 M KCl solution had larger and much larger empty voids than in 0.025 M KCl and deionized water, respectively. TXM nanotomography is a new technique that can be useful in the field as a tool for a better understanding of clay mineral micro-structure.

Keywords: potassium, sequential extraction process, clay mineral, TXM

Procedia PDF Downloads 270
7049 Implementation of Efficiency and Energy Conservation Concept in Office Building as an Effort to Achieve Green Office Building Case Studies Office Building in Jakarta

Authors: Jarwa Prasetya Sih Handoko

Abstract:

The issue of an energy crisis in Indonesia's big cities has been raised as urban development increases rapidly. Various attempts have been made by the government to overcome the problem of energy needs in Indonesia. In addition to the government's efforts, efforts by the public are required to solve this problem. The green building concept, with its effort to use energy efficiently in building design, can be one of the approaches applied to solve this problem. Jakarta is the capital and one of the major cities in Indonesia with high economic growth. This leads to increased demand for office space, so the construction of office buildings in big cities like Jakarta is very extensive. Office buildings are among the buildings that require large energy consumption. As buildings that potentially require huge amounts of energy, their design should consider the use of energy to help provide solutions to the energy crisis in Indonesia. The concept of energy efficiency is one of the concepts addressed in an effort to use energy in buildings in a way that saves on the energy needed for building operations. Therefore, a study that explores the application of the concept of energy efficiency and conservation in office buildings in Jakarta is necessary. This study uses two case-study buildings, the Sequis Center Building and Sampoerna Strategic Square. Both are office buildings in Jakarta that have earned the Green Building Certificate of the Green Building Council Indonesia (GBCI). The study used literature review methods to address the issues raised earlier, covering the literature on both office buildings and green building. This paper is expected to yield insights on the application of the concept of energy efficiency and conservation in office buildings that have earned recognition as green buildings by GBCI. The results could serve as a reference for architects designing future office buildings, especially with regard to the concept of energy use in buildings. From this study, it can be concluded that the concept of energy efficiency and conservation in the design of office buildings can be applied through building orientation, openings, the use of shade in buildings, vegetation, building material selection and the efficient use of water. In this way, the energy requirements needed to meet the needs of building users' activities can be reduced, so the concept of energy efficiency and conservation in office buildings can be one of the efforts to realize the Green Office Building. The recommendation from this study is that the design of office buildings should apply the concept of efficient energy utilization in order to meet the energy needs of office buildings and thereby realize the Green Building.

Keywords: energy crisis, energy efficiency, energy conservation, green building, office building

Procedia PDF Downloads 285
7048 Explosion Mechanics of Aluminum Plates Subjected to the Combined Effect of Blast Wave and Fragment Impact Loading: A Multicase Computational Modeling Study

Authors: Atoui Oussama, Maazoun Azer, Belkassem Bachir, Pyl Lincy, Lecompte David

Abstract:

For many decades, researchers have focused on understanding the dynamic behavior of different structures and materials subjected to fragment impact or blast loads separately. Explosion mechanics and impact physics studies dealing with the numerical modeling of the response of protective structures under the synergistic effect of a blast wave and the impact of fragments are quite limited in the literature. This article numerically evaluates the nonlinear dynamic behavior and damage mechanisms of aluminum plates (EN AW-1050A-H24) under different combined loading scenarios, varied by the sequence of the applied loads, using the commercial software LS-DYNA. On the one hand, with respect to the terminal ballistic investigations, a Lagrangian (LAG) formulation is used to evaluate the different failure modes of the target material in the case of a fragment impact. On the other hand, with respect to the blast analysis, an Arbitrary Lagrangian-Eulerian (ALE) formulation is considered to study the fluid-structure interaction (FSI) of the shock wave and the plate in the case of blast loading. Four different loading scenarios are considered: (1) only blast loading, (2) only fragment impact, (3) blast loading followed by a fragment impact and (4) a fragment impact followed by blast loading. From the numerical results, it was observed that when the impact load is applied to the plate prior to the blast load, the plate suffers more severe damage due to the hole enlargement phenomenon and the effects of crack propagation on the circumference of the damaged zone. Moreover, it was found that the hole from the fragment impact loading was enlarged to about three times the diameter of the projectile. The validation of the proposed computational model is based in part on previous experimental data obtained by the authors and in part on experimental data obtained from the literature.

Keywords: computational analysis, combined loading, explosion mechanics, hole enlargement phenomenon, impact physics, synergistic effect, terminal ballistic

Procedia PDF Downloads 164
7047 Effects of Surface Textures and Chemistries on Wettability

Authors: Dipti Raj, Himanshu Mishra

Abstract:

Wetting of a solid surface by a liquid is an extremely common yet subtle phenomenon in natural and applied sciences. A clear understanding of both short and long-term wetting behaviors of surfaces is essential for creating robust anti-biofouling coatings, non-wetting textiles, non-fogging mirrors, and preventive linings against dirt and icing. In this study, silica beads (diameter, D ≈ 100 μm) functionalized using different silane reagents were employed to modify the wetting characteristics of smooth polydimethylsiloxane (PDMS) surfaces. Resulting composite surfaces were found to be super-hydrophobic, i.e. contact angle of water,

Keywords: contact angle, Cassie-Baxter, PDMS, silica, texture, wetting

Procedia PDF Downloads 237
7046 Functions and Challenges of New County-Based Regional Plan in Taiwan

Authors: Yu-Hsin Tsai

Abstract:

A new, mandated county regional plan system has been in place nationwide in Taiwan since 2010, with its role situated between the policy-led cross-county regional plan and the blueprint-led city plan. This new regional plan contains both urban and rural areas in one single plan, which provides a more complete planning territory, i.e., the city region within the county's jurisdiction, and can be executed and managed effectively by the county government. However, the full picture of its functions and characteristics, compared with other levels of plans, still seems unclear, as are the planning goals and issues that can be most appropriately dealt with at this spatial scale. In addition, the extent to which the sustainability ideal and measures to cope with climate change have been included is unclear. Based on the above issues, this study aims to clarify the roles of the county regional plan, to analyze the extent to which its measures address sustainability, climate change, and the forecasted declining population, and to identify the success factors and issues faced in the planning process. The methodology applied includes literature review, plan quality evaluation, and interviews with officials of the central and local governments and urban planners involved for all 23 counties in Taiwan. The preliminary research results show, first, that growth management related policies have been widely implemented and are expected to have an effective impact, including incorporating resource capacity to determine the maximum population for the city region as a whole, developing an overall vision of the urban growth boundary for the whole city region, prioritizing infill development, and favoring the use of building land within the urbanized area over the rural area to cope with urban growth. Secondly, planning-oriented zoning is adopted in urban areas, while demand-oriented planning permission is applied in the rural areas with designated plans. Then, public participation has evolved to the next level, overseeing all of the government's planning and review processes, due to decreasing trust in the government and the development of public forums on the internet. Next, fertile agricultural land is preserved to maintain the food self-sufficiency goal for national security concerns. More adaptation-based methods than mitigation-based methods have been applied to cope with global climate change. Finally, better land use and transportation planning, in terms of avoiding the development of rail transit stations and corridors in rural areas, is promoted. Even though many promising, prompt measures have been adopted, challenges remain. First, overall urban density, which likely affects the success of the UGB or the use of rural agricultural land, has not been incorporated, possibly due to implementation difficulties. Second, land-use related measures for mitigating climate change seem less clear and hence less employed. Smart decline has not drawn enough attention as a way to cope with the predicted population decrease in the next decade. Then, some reluctance from county governments to implement the county regional plan can be observed, possibly because limits have been set on further development of agricultural land and sensitive areas. Finally, resolving the issue of existing illegal factories on agricultural land remains the most challenging dilemma.

Keywords: city region plan, sustainability, global climate change, growth management

Procedia PDF Downloads 331
7045 Lipase-Mediated Formation of Peroxyoctanoic Acid Used in Catalytic Epoxidation of α-Pinene

Authors: N. Wijayati, Kusoro Siadi, Hanny Wijaya, Maggy Thenawijjaja Suhartono

Abstract:

This work describes the lipase-mediated synthesis of α-pinene oxide at ambient temperature. The immobilized lipase from Pseudomonas aeruginosa is used to generate peroxyoctanoic acid directly from octanoic acid and hydrogen peroxide. The peroxy acid formed is then applied for the in situ oxidation of α-pinene. High conversion of α-pinene to α-pinene oxide (approximately 78%) was achieved when using 0.1 g of the lipase enzyme, 6 mmol H2O2, and 5 mmol octanoic acid. Various parameters affecting the conversion of α-pinene to α-pinene oxide were studied.

Keywords: α-pinene, P. aeruginosa, octanoic acid

Procedia PDF Downloads 260
7044 Effects of Irrigation Applications during Post-Anthesis Period on Flower Development and Pyrethrin Accumulation in Pyrethrum

Authors: Dilnee D. Suraweera, Tim Groom, Brian Chung, Brendan Bond, Andrew Schipp, Marc E. Nicolas

Abstract:

Pyrethrum (Tanacetum cinerariifolium) is a perennial plant belonging to the family Asteraceae. It is cultivated commercially for the extraction of natural insecticides, pyrethrins, which accumulate in the achenes of its flower heads. Approximately 94% of the pyrethrins are produced within the secretory ducts and trichomes of the achenes of the mature pyrethrum flower. Pyrethrum is the most widely used botanical insecticide in the world, and Australia is currently the largest pyrethrum producer in the world. Rainfall in the Australian pyrethrum growing regions during the flowering period, in late spring and early summer, is significantly limited. The lack of adequate soil moisture and the elevated temperature conditions during the post-anthesis period result in yield reductions. Therefore, an understanding of the yield responses of pyrethrum to irrigation is important for pyrethrum as a commercial crop. Irrigation management has been identified as a key area of pyrethrum crop management that could be manipulated to increase yield. Pyrethrum is a comparatively drought tolerant plant and has some ability to survive in dry conditions due to deep rooting. But in dry areas and in dry seasons, the crop cannot reach its full yield potential without adequate soil moisture. Therefore, irrigation during the flowering period is essential to prevent crop water stress and maximise yield. Irrigation during the water deficit period results in an overall increased rate of water uptake and growth by the plant, which is essential to achieve the maximum yield benefits from commercial crops. The effects of irrigation treatments applied during the post-anthesis period on pyrethrum yield responses were studied using two irrigation methods. The study was conducted in a first-harvest commercial pyrethrum field in Waubra, Victoria, during the 2012/2013 season. Drip irrigation and overhead sprinkler irrigation treatments applied during the whole flowering period were compared with a 'rainfed' treatment in relation to flower yield and pyrethrin yield responses. The results of this experiment showed that the application of 180 mm of irrigation throughout the post-anthesis period, from the early flowering stages to physiological maturity, under the drip irrigation treatment increased pyrethrin concentration by 32%, which, combined with the 95% increase in flower yield, gave a total pyrethrin yield increase of 157% compared to the 'rainfed' treatment. In contrast, the overhead sprinkler irrigation treatment increased pyrethrin concentration by 19%, which, combined with the 60% increase in flower yield, gave a total pyrethrin yield increase of 91% compared to the 'rainfed' treatment. Irrigation treatments applied throughout the post-anthesis period significantly increased flower yield as a result of the enhancement of the number of flowers and flower size. Irrigation provides adequate soil moisture for flower development in pyrethrum, which slows the rate of flower development and increases the length of the flowering period, resulting in a delayed crop harvest (11 days) compared to the 'rainfed' treatment. Overall, irrigation has a major impact on pyrethrin accumulation, increasing the rate and duration of pyrethrin accumulation and resulting in a higher pyrethrin yield per flower at physiological maturity. The findings of this study will be important for future yield predictions and for developing advanced agronomic strategies to maximise pyrethrin yield in pyrethrum.

Keywords: achene, drip irrigation, overhead irrigation, pyrethrin

Procedia PDF Downloads 393
7043 Reliability of Dry Tissues Sampled from Exhumed Bodies in DNA Analysis

Authors: V. Agostini, S. Gino, S. Inturri, A. Piccinini

Abstract:

In cases of corpse identification or parental testing performed on an exhumed alleged father, we usually seek and acquire organic samples such as bones and/or bone fragments, teeth, nails and muscle fragments. The DNA analysis of these cadaveric matrices usually leads to identification success, but it often happens that the results of the typing are not satisfactory, with highly degraded, partial or even non-interpretable genetic profiles. To aggravate the interpretative panorama deriving from the analysis of such 'classical' organic matrices, we must add a long and laborious treatment of the sample, ranging from mechanical fragmentation up to the protracted decalcification phase. These steps greatly increase the chance of sample contamination. In the present work, instead, we report the use of 'unusual' cadaveric matrices, demonstrating that their forensic genetic analysis can lead to better results in less time and with lower reagent costs. We report six cases, the result of on-field experience, in which eye swabs and cartilage were sampled and analyzed, allowing clear single genetic profiles, useful for identification purposes, to be obtained. In all cases, we used the standard DNA tissue extraction protocols (as reported in the user manuals of the manufacturers, such as QIAGEN or Invitrogen-Thermo Fisher Scientific), thus bypassing the long and difficult phases of mechanical fragmentation and decalcification of bone samples. PCR was carried out using the PowerPlex® Fusion System kit (Promega), and capillary electrophoresis was carried out on an ABI PRISM® 310 Genetic Analyzer (Applied Biosystems®), with GeneMapper ID v3.2.1 (Applied Biosystems®) software. The software Familias (version 3.1.3) was employed for kinship analysis. The genetic results achieved proved to be much better than those from the analysis of bones or nails, both from the qualitative and quantitative points of view and in terms of costs and timing. In this way, by using the standard procedure of DNA extraction from tissue, it is possible to obtain, in a shorter time and with maximum efficiency, an excellent genetic profile, which proves to be useful and can be easily interpreted for later paternity tests and/or identification of human remains.

Keywords: DNA, eye swabs and cartilage, identification of human remains, paternity testing

Procedia PDF Downloads 93