Search results for: applied mechanics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8570

7070 Parameter Estimation in Dynamical Systems Based on Latent Variables

Authors: Arcady Ponosov

Abstract:

A novel mathematical approach is suggested which facilitates a compressed representation and efficient validation of parameter-rich ordinary differential equation models describing the dynamics of complex, especially biology-related, systems, based on identification of the system's latent variables. In particular, an efficient parameter estimation method for the compressed non-linear dynamical systems is developed. The method is applied to so-called 'power-law systems', non-linear differential equations typically used in Biochemical Systems Theory.
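Power-law (S-system) models have the generic form dx_i/dt = alpha_i * prod_j x_j^g_ij - beta_i * prod_j x_j^h_ij. As a minimal illustrative sketch (with made-up coefficients, not the model studied in the paper), such a system can be integrated with a simple forward-Euler loop:

```python
import numpy as np

def s_system_rhs(x, alpha, g, beta, h):
    # dx_i/dt = alpha_i * prod_j x_j^g_ij - beta_i * prod_j x_j^h_ij
    return alpha * np.prod(x ** g, axis=1) - beta * np.prod(x ** h, axis=1)

# hypothetical 2-species cascade: x1 produced at a constant rate, x2 fed by x1
alpha = np.array([2.0, 1.0])
beta = np.array([1.0, 0.5])
g = np.array([[0.0, 0.0],      # production exponents (kinetic orders)
              [1.0, 0.0]])
h = np.array([[1.0, 0.0],      # degradation exponents
              [0.0, 1.0]])

x = np.array([0.5, 0.5])
dt = 0.01
for _ in range(5000):          # forward-Euler integration to t = 50
    x = x + dt * s_system_rhs(x, alpha, g, beta, h)
```

At steady state the production and degradation terms balance: here x1 tends to alpha1/beta1 = 2 and x2 to (alpha2/beta2)*x1 = 4.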

Keywords: generalized law of mass action, metamodels, principal components, synergetic systems

Procedia PDF Downloads 340
7069 The Effectiveness of a Courseware in 7th Grade Chemistry Lesson

Authors: Oguz Ak

Abstract:

In this study, a courseware for the learning unit 'Properties of Matter' in the chemistry course is developed. The courseware is applied to fifteen 7th grade (about age 14) students in real settings. As a result of the study, it is found that the students' grades in the learning unit significantly increased when they studied the courseware themselves. In addition, the score improvements of the students who found the courseware usable are not significantly higher than those of the students who did not find it usable.

Keywords: computer based instruction, effect of courseware and usability of courseware, 7th grade

Procedia PDF Downloads 450
7068 Optical Imaging Based Detection of Solder Paste in Printed Circuit Board Jet-Printing Inspection

Authors: D. Heinemann, S. Schramm, S. Knabner, D. Baumgarten

Abstract:

Purpose: Applying solder paste to printed circuit boards (PCB) with stencils has been the method of choice over the past years. A new method uses a jet printer to deposit tiny droplets of solder paste through an ejector mechanism onto the board. This allows for more flexible PCB layouts with smaller components. Due to the viscosity of the solder paste, air blisters can be trapped in the cartridge. This can lead to missing solder joints or deviations in the applied solder volume. Therefore, a built-in and real-time inspection of the printing process is needed to minimize uncertainties and increase the efficiency of the process by immediate correction. The objective of the current study is the design of an optimal imaging system and the development of an automatic algorithm for the detection of applied solder joints from the captured optical images. Methods: In a first approach, a camera module connected to a microcomputer and LED strips are employed to capture images of the printed circuit board under four different illuminations (white, red, green and blue). Subsequently, an improved system including a ring light, an objective lens, and a monochromatic camera was set up to acquire higher quality images. The obtained images can be divided into three main components: the PCB itself (i.e., the background), the reflections induced by unsoldered positions or screw holes and the solder joints. Non-uniform illumination is corrected by estimating the background using a morphological opening and subtracting it from the input image. Image sharpening is applied in order to prevent error pixels in the subsequent segmentation. The intensity thresholds which divide the main components are obtained from the multimodal histogram using three probability density functions. Determining the intersections delivers proper thresholds for the segmentation. Remaining edge gradients produce small error areas which are removed by another morphological opening.
For quantitative analysis of the segmentation results, the Dice coefficient is used. Results: The obtained PCB images show a significant gradient in all RGB channels, resulting from ambient light. Using different lightings and color channels, 12 images of a single PCB are available. A visual inspection and the investigation of 27 specific points show the best differentiation between those points using a red lighting and a green color channel. Estimating two thresholds from analyzing the multimodal histogram of the corrected images and using them for segmentation precisely extracts the solder joints. The comparison of the results to manually segmented images yields high sensitivity and specificity values. Analyzing the overall result delivers a Dice coefficient of 0.89, which varies for single object segmentations between 0.96 for well-segmented solder joints and 0.25 for single negative outliers. Conclusion: Our results demonstrate that the presented optical imaging system and the developed algorithm can robustly detect solder joints on printed circuit boards. Future work will comprise a modified lighting system which allows for more precise segmentation results using structure analysis.
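The illumination-correction step described above (background estimated by a grayscale morphological opening, then subtracted from the input) can be sketched in a few lines; the 9x9 kernel, the 0.5 threshold, and the synthetic ramp-plus-blob image below are illustrative assumptions, not the authors' parameters:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def grey_opening(img, k):
    # grayscale opening with a flat k x k kernel:
    # erosion (local minimum) followed by dilation (local maximum)
    pad = k // 2
    def local(a, fn):
        p = np.pad(a, pad, mode='edge')
        return fn(sliding_window_view(p, (k, k)), axis=(-2, -1))
    return local(local(img, np.min), np.max)

# synthetic PCB image: smooth illumination ramp plus one small bright "joint"
y, x = np.mgrid[0:64, 0:64]
img = 0.5 * x / 63.0
img[20:23, 20:23] += 1.0

background = grey_opening(img, 9)   # opening removes blobs smaller than the kernel
corrected = img - background        # flat-field corrected image
mask = corrected > 0.5              # simple threshold segmentation
```

After correction, the ambient-light gradient is flattened and only the small bright structure survives the threshold.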

Keywords: printed circuit board jet-printing, inspection, segmentation, solder paste detection

Procedia PDF Downloads 322
7067 Development and Effects of Transtheoretical Model Exercise Program for Elderly Women with Chronic Back Pain

Authors: Hyun-Ju Oh, Soon-Rim Suh, Mihan Kim

Abstract:

The steady and rapid increase of the older population is a global phenomenon. Chronic diseases and disabilities increase with aging. In general, exercise has been known to be most effective in preventing and managing chronic back pain. However, it is hard for older women to initiate and maintain exercise. The transtheoretical model (TTM) is one of the theories that explain behavioral changes such as exercise. Applying a program that considers the stage of behavior change is effective in helping elderly women start and maintain exercise. The purpose of this study was to develop a TTM-based exercise program and to examine its effects for elderly women with chronic back pain. For the program evaluation, a non-equivalent control-group pretest-posttest design was applied. The independent variable of this study is the exercise intervention program. The contents of the program were constructed considering the characteristics of elderly women with chronic low back pain, focusing on the processes and stages of change identified in previous studies. The developed exercise program was applied to elderly women with chronic low back pain in the planning and preparation stages. The subjects were 50 women over 65 years of age with chronic back pain who did not practice regular exercise. The experimental group (n=25) received the 8-week TTM-based exercise program. The control group received a booklet on low back pain management. Data were collected at three time points: before the exercise intervention, right after the intervention, and 4 weeks after the intervention. The dependent variables were the processes of change, decisional balance, exercise self-efficacy, back pain, depression and muscle strength. The results of this study were as follows.
Processes of change (p<.001), pros of decisional balance (p<.001), exercise self-efficacy (p<.001), back pain (p<.001), depression (p<.001) and muscle strength (p<.001) all improved significantly more in the experimental group than in the control group, both right after the program and 4 weeks after the program. These results show that applying the TTM-based exercise program increases the use of the processes of change, increases exercise self-efficacy, advances the stage of exercise behavior change, and improves muscle strength while lowering pain and depression. The significance of the study is that it confirms the effect of continuous exercise: applying an exercise program based on the transtheoretical model helps elderly women with chronic low back pain who intend to exercise establish and maintain regular exercise habits.

Keywords: chronic back pain, elderly, exercise, women

Procedia PDF Downloads 243
7066 Literature Review and Evaluation of the Internal Marketing Theory

Authors: Hsiao Hsun Yuan

Abstract:

Internal marketing was proposed in the 1970s, and the theory behind the concept has continually changed over the past forty years. This study discusses the following themes: the definition and implications of internal marketing, the progress of its development, and the evolution of its theoretical model. Moreover, the study systematically organizes the internal marketing strategies adopted by enterprises and how they are put into practice. It also compares empirical studies focusing on how the existing theories influence the important variables of internal marketing. The results of this study are expected to serve as references for future exploration of the boundaries of internal marketing and for studies of how it is applied to different types of enterprises.

Keywords: corporate responsibility, employee organizational performance, internal marketing, internal customer

Procedia PDF Downloads 337
7065 Numerical Simulation of Encased Composite Column Bases Subjected to Cyclic Loading

Authors: Eman Ismail, Adnan Masri

Abstract:

Energy dissipation in ductile moment frames occurs mainly through plastic hinge rotations in its members (beams and columns). Generally, plastic hinge locations are pre-determined and limited to the beam ends, where columns are designed to remain elastic in order to avoid premature instability (aka story mechanisms) with the exception of column bases, where a base is 'fixed' in order to provide higher stiffness and stability and to form a plastic hinge. Plastic hinging at steel column bases in ductile moment frames using conventional base connection details is accompanied by several complications (thicker and heavily stiffened connections, larger embedment depths, thicker foundation to accommodate anchor rod embedment, etc.). An encased composite base connection is proposed where a segment of the column beginning at the base up to a certain point along its height is encased in reinforced concrete with headed shear studs welded to the column flanges used to connect the column to the concrete encasement. When the connection is flexurally loaded, stresses are transferred to a reinforced concrete encasement through the headed shear studs, and thereby transferred to the foundation by reinforced concrete mechanics, and axial column forces are transferred through the base-plate assembly. Horizontal base reactions are expected to be transferred by the direct bearing of the outer and inner faces of the flanges; however, investigation of this mechanism is not within the scope of this research. The inelastic and cyclic behavior of the connection will be investigated where it will be subjected to reversed cyclic loading, and rotational ductility will be observed in cases of yielding mechanisms where yielding occurs as flexural yielding in the beam-column, shear yielding in headed studs, and flexural yielding of the reinforced concrete encasement. 
The findings of this research show that the connection is capable of achieving satisfactory levels of ductility in certain conditions given proper detailing and proportioning of elements.

Keywords: seismic design, plastic mechanisms, steel structure, moment frame, composite construction

Procedia PDF Downloads 114
7064 Optical Whitening of Textiles: Teaching and Learning Materials

Authors: C. W. Kan

Abstract:

This study examines the results of the optical whitening process on different textiles such as cotton, wool and polyester. The optical whitening agents used are commercially available products, and they were applied to the textiles following the manufacturers' suggested methods. The aim of this study is to illustrate the proper application methods of optical whitening agents to different textiles and hence to provide guidance notes for students learning this topic. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for its financial support of this work.

Keywords: learning materials, optical whitening agent, wool, cotton, polyester

Procedia PDF Downloads 408
7063 The Routine Use of a Negative Pressure Incision Management System in Vascular Surgery: A Case Series

Authors: Hansraj Bookun, Angela Tan, Rachel Xuan, Linheng Zhao, Kejia Wang, Animesh Singla, David Kim, Christopher Loupos

Abstract:

Introduction: Incisional wound complications in vascular surgery patients represent a significant clinical and economic burden of morbidity and mortality. The objective of this study was to trial the feasibility of applying the Prevena negative pressure incision management system as a routine dressing in patients who had undergone arterial surgery. Conventionally, Prevena has been applied to groin incisions, but this study features applications on multiple wound sites such as the thigh or major amputation stumps. Method: This was a cross-sectional, observational, single-centre case series of 12 patients who had undergone major vascular surgery. Their wounds were managed with the Prevena system, applied either intra-operatively or on the first post-operative day. Demographic and operative details were collated, as well as the length of stay and complication rates. Results: There were 9 males (75%) with a mean age of 66 years, and the comorbid burden was as follows: ischaemic heart disease (92%), diabetes (42%), hypertension (100%), stage 4 or greater kidney impairment (17%) and current or ex-smoking (83%). The main indications were acute ischaemia (33%), claudication (25%), and gangrene (17%). There were single instances of an occluded popliteal artery aneurysm, diabetic foot infection, and rest pain. The majority of patients (50%) had hybrid operations with iliofemoral endarterectomies, patch arterioplasties, and further peripheral endovascular treatment. There were 4 complex arterial bypass operations and 2 major amputations. The mean length of stay was 17 ± 10 days, with a range of 4 to 35 days. A single complication, in the form of a lymphocoele, was encountered in the context of an iliofemoral endarterectomy and patch arterioplasty. This was managed conservatively. There were no deaths.
Discussion: The Prevena wound management system shows that, in conjunction with safe vascular surgery, absolute wound complication rates remain low, and it remains a valuable adjunct in the treatment of vascular patients.

Keywords: wound care, negative pressure, vascular surgery, closed incision

Procedia PDF Downloads 115
7062 Design of a Fuzzy Luenberger Observer for Fault Nonlinear System

Authors: Mounir Bekaik, Messaoud Ramdani

Abstract:

We present in this work a new stabilization technique for faulty nonlinear systems. The approach we adopt focuses on a fuzzy Luenberger observer. The T-S approximation of the nonlinear observer is based on the fuzzy C-means clustering algorithm to find local linear subsystems. The MOESP identification approach was applied to design an empirical model describing the subsystems' state variables. The gain of the observer is obtained by minimizing the estimation error through a Lyapunov-Krasovskii functional and an LMI approach. We consider a three-tank hydraulic system as an illustrative example.
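As a sketch of the clustering step only (not the observer design), the standard fuzzy C-means alternating updates — memberships from inverse-distance ratios, centers from membership-weighted means — can be written as follows on synthetic 2-D data; the data and initialization are illustrative, not the paper's tank system:

```python
import numpy as np

def fuzzy_c_means(X, init_idx, m=2.0, iters=50):
    # standard FCM alternation: memberships from inverse-distance ratios,
    # centers from membership-weighted means (fuzzifier m > 1)
    centers = X[np.asarray(init_idx)]
    for _ in range(iters):
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-9
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=0)            # memberships sum to 1 per point
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
    return centers, U

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),    # blob near (0, 0)
               rng.normal(5.0, 0.1, (20, 2))])   # blob near (5, 5)
centers, U = fuzzy_c_means(X, init_idx=[0, 39])  # init one center per blob
```

Each row of U gives the fuzzy membership of every point in one cluster; in T-S modeling these memberships weight the local linear subsystems.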

Keywords: nonlinear system, fuzzy, faults, TS, Lyapunov-Krasovskii, observer

Procedia PDF Downloads 311
7061 Comparing Test Equating by Item Response Theory and Raw Score Methods with Small Sample Sizes on a Study of the ARTé: Mecenas Learning Game

Authors: Steven W. Carruthers

Abstract:

The purpose of the present research is to equate two test forms as part of a study to evaluate the educational effectiveness of the ARTé: Mecenas art history learning game. The researcher applied Item Response Theory (IRT) procedures to calculate item, test, and mean-sigma equating parameters. With the sample size n=134, test parameters indicated “good” model fit but low Test Information Functions and more acute than expected equating parameters. Therefore, the researcher applied equipercentile equating and linear equating to raw scores and compared the equated form parameters and effect sizes from each method. Item scaling in IRT enables the researcher to select a subset of well-discriminating items. The mean-sigma step produces a mean-slope adjustment from the anchor items, which was used to scale the score on the new form (Form R) to the reference form (Form Q) scale. In equipercentile equating, scores are adjusted to align the proportion of scores in each quintile segment. Linear equating produces a mean-slope adjustment, which was applied to all core items on the new form. The study followed a quasi-experimental design with purposeful sampling of students enrolled in a college-level art history course (n=134) and a counterbalancing design to distribute both forms on the pre- and post-tests. The Experimental Group (n=82) was asked to play ARTé: Mecenas online and complete Level 4 of the game within a two-week period; 37 participants completed Level 4. Over the same period, the Control Group (n=52) did not play the game. The researcher examined between-group differences from post-test scores on test Form Q and Form R by a full-factorial two-way ANOVA. The raw score analysis indicated a 1.29% direct effect of form, which was statistically non-significant but may be practically significant. The researcher repeated the between-group differences analysis with all three equating methods. For the IRT mean-sigma adjusted scores, form had a direct effect of 8.39%.
Mean-sigma equating with a small sample may have resulted in inaccurate equating parameters. Equipercentile equating aligned test means and standard deviations, but the resultant skewness and kurtosis worsened compared to the raw score parameters. Form had a 3.18% direct effect. Linear equating produced the lowest form effect, approaching 0%. Using linearly equated scores, the researcher conducted an ANCOVA to examine the effect size in terms of prior knowledge. The between-group effect size for the Control Group versus the Experimental Group participants who completed the game was 14.39%, with a 4.77% effect size attributed to pre-test score. Playing and completing the game increased art history knowledge, and individuals with low prior knowledge tended to gain more from pre- to post-test. Ultimately, researchers should approach test equating based on their theoretical stance on Classical Test Theory and IRT and their respective assumptions. Regardless of the approach or method, test equating requires a representative sample of sufficient size. With small sample sizes, the application of a range of equating approaches can expose item and test features for review, inform interpretation, and identify paths for improving instruments for future study.
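Linear equating, the method that produced the smallest form effect here, maps a new-form score onto the reference scale by matching means and standard deviations: y(x) = mu_Q + (sigma_Q / sigma_R) * (x - mu_R). A sketch with made-up score vectors (not the study's data):

```python
import numpy as np

def linear_equate(x, new_form, ref_form):
    # map a new-form raw score x onto the reference-form scale
    # by matching the two score distributions' means and SDs
    mu_n, sd_n = new_form.mean(), new_form.std()
    mu_r, sd_r = ref_form.mean(), ref_form.std()
    return mu_r + (sd_r / sd_n) * (x - mu_n)

form_r = np.array([10, 12, 14, 16, 18], dtype=float)  # hypothetical Form R scores
form_q = np.array([20, 24, 28, 32, 36], dtype=float)  # hypothetical Form Q scores
```

A score at the Form R mean maps to the Form Q mean, and a score one Form R standard deviation above the mean maps one Form Q standard deviation above it.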

Keywords: effectiveness, equipercentile equating, IRT, learning games, linear equating, mean-sigma equating

Procedia PDF Downloads 182
7060 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on an encoding scheme (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) have richer information but lack geometric invariance. For scene classification, there are scattered objects of different sizes, categories, layouts, numbers and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, which is then merged into a single vector by using a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of the first and second order differences. Hence, the scale-wise normalization followed by average pooling balances the influence of each scale since different amounts of features are extracted. Third, the Fisher Vector representation based on the deep convolutional features is followed by a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories.
Experimental results show that the scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which suggests that the representation can be applied to other visual recognition tasks.
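The scale-wise normalization step can be read as: normalize each per-scale global descriptor independently so that no scale dominates, then average-pool across scales. A sketch assuming the common signed-square-root plus L2 normalization of Fisher Vector pipelines (the vectors are made up):

```python
import numpy as np

def scale_wise_normalize(per_scale_vectors):
    # normalize each scale's descriptor independently, then average-pool,
    # so scales with more extracted features do not dominate the final vector
    normed = []
    for v in per_scale_vectors:
        v = np.sign(v) * np.sqrt(np.abs(v))    # power (signed-sqrt) normalization
        v = v / (np.linalg.norm(v) + 1e-12)    # per-scale L2 normalization
        normed.append(v)
    return np.mean(normed, axis=0)             # average pooling across scales

v_small = np.array([1.0, -1.0, 2.0])           # hypothetical per-scale descriptors
v_large = 100.0 * v_small                      # same direction, larger magnitude
merged = scale_wise_normalize([v_small, v_large])
```

Because each scale is normalized before pooling, a scale whose descriptor is 100 times larger contributes exactly as much as the small one.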

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 315
7059 Sequential Covering Algorithm for Nondifferentiable Global Optimization Problem and Applications

Authors: Mohamed Rahal, Djaouida Guetta

Abstract:

In this paper, the one-dimensional unconstrained global optimization problem for continuous functions satisfying a Hölder condition is considered. We extend the sequential covering algorithm (SCA) for Lipschitz functions to the larger class of Hölder functions. The convergence of the method is studied, and the algorithm can be applied to systems of nonlinear equations. Finally, some numerical examples are presented to illustrate the efficiency of the present approach.
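For a function satisfying |f(x) - f(y)| <= H * |x - y|^alpha, each evaluated point x_i yields the lower bound f(x_i) - H * |x - x_i|^alpha, and a sequential covering method repeatedly samples where the current bound is lowest. A minimal Piyavskii-style sketch on a dense grid (the test function, constants, and grid are illustrative, not the paper's algorithm):

```python
import numpy as np

def holder_minimize(f, a, b, H, alpha, iters=60):
    # maintain the Hölder lower bound max_i [f(x_i) - H * |x - x_i|^alpha]
    # on a fine grid and always evaluate f where the bound is lowest
    grid = np.linspace(a, b, 2001)
    xs, fs = [a, b], [f(a), f(b)]
    for _ in range(iters):
        bound = np.max([fv - H * np.abs(grid - xv) ** alpha
                        for xv, fv in zip(xs, fs)], axis=0)
        x_new = float(grid[np.argmin(bound)])
        xs.append(x_new)
        fs.append(f(x_new))
    i = int(np.argmin(fs))
    return xs[i], fs[i]

# quadratic with minimum at 0.7; H=2 bounds |f'| on [0, 1] (alpha=1: Lipschitz case)
x_best, f_best = holder_minimize(lambda t: (t - 0.7) ** 2, 0.0, 1.0, H=2.0, alpha=1.0)
```

Because the bound never exceeds f, sampling its minimizer either finds a better point or tightens the cover around the global minimum.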

Keywords: global optimization, Hölder functions, sequential covering method, systems of nonlinear equations

Procedia PDF Downloads 352
7058 Legal Means for Access to Information Management

Authors: Sameut Bouhaik Mostafa

Abstract:

The Access to Information Act is the Canadian law that gives the public a right of access to information held by government institutions. It declares that government information should be available to the public, that exceptions to this right should be limited and specific, and that decisions on the disclosure of government information should be reviewed independently of the government. By 1982, a dozen countries, including France, Denmark, Finland, Sweden, the Netherlands and the United States (1966), had enacted similar access-to-information legislation. Canada's Access to Information Act came into force in 1983, under the government of Pierre Trudeau, permitting Canadians to retrieve information from government files, defining what information can be accessed, and imposing timetables for responses. The Act is administered by the Information Commissioner of Canada.

Keywords: law, information, management, legal

Procedia PDF Downloads 397
7057 Applications of AFM in 4D to Optimize the Design of Genetic Nanoparticles

Authors: Hosam Abdelhady

Abstract:

Filming the behavior of individual DNA molecules in their environment as they interact with individual medicinal nano-polymers at the molecular scale has opened the door to understanding the effects of molecular shape, size, and incubation time with nanocarriers on the design of robust genetic nanomolecules able to resist enzymatic degradation, enter the cell, reach the nucleus and kill individual cancer cells in their environment. To this end, we will show how we applied 4D AFM as a guide to fine-tune the design of genetic nanoparticles and to film the effects of these nanoparticles on the nanomechanical and morphological profiles of individual cancer cells.

Keywords: AFM, dendrimers, nanoparticles, DNA, gene therapy, imaging

Procedia PDF Downloads 64
7056 Annealing of the Contact between Graphene and Metal: Electrical and Raman Study

Authors: A. Sakavičius, A. Lukša, V. Nargelienė, V. Bukauskas, G. Astromskas, A. Šetkus

Abstract:

We investigate the influence of annealing on the properties of a contact between graphene and metal (Au and Ni), using circular transmission line model (CTLM) contact geometry. Kelvin probe force microscopy (KPFM) and Raman spectroscopy are applied for characterization of the surface and interface properties. Annealing causes a decrease of the metal-graphene contact resistance for both Ni and Au.

Keywords: Au/Graphene contacts, graphene, Kelvin force probe microscopy, NiC/Graphene contacts, Ni/Graphene contacts, Raman spectroscopy

Procedia PDF Downloads 294
7055 Using Life Cycle Assessment in Potable Water Treatment Plant: A Colombian Case Study

Authors: Oscar Orlando Ortiz Rodriguez, Raquel A. Villamizar-G, Alexander Araque

Abstract:

Colombia has a total of 1,027 municipalities; 70% of municipalities have Potable Water Treatment Plants (PWTPs) in urban areas and 20% in rural areas. These PWTPs are typically supplied by surface waters (mainly rivers) and resort to gravity, pumping and/or mixed systems to get the water from the catchment point, where the first stage of the potable water process takes place. Subsequently, a series of conventional methods are applied, consisting of a more or less standardized sequence of physicochemical and, sometimes, biological treatment processes which vary depending on the quality of the water that enters the plant. These processes require energy and chemical supplies in order to guarantee an adequate product for human consumption. Therefore, in this paper, we applied the environmental methodology of Life Cycle Assessment (LCA) to evaluate the environmental loads of a potable water treatment plant (PWTP) located in northeastern Colombia, following the international guidelines of ISO 14040. The different stages of the potable water process, from the catchment point through pumping to the distribution network, were thoroughly assessed. The functional unit was defined as 1 m³ of treated water. The data were analyzed through the database Ecoinvent v.3.01, and modeled and processed in the software LCA-Data Manager. The results allowed determining that, in the plant, the largest impact was caused by Clarifloc (82%), followed by chlorine gas (13%) and power consumption (4%). In this context, the company involved in the sustainability of the potable water service should ideally reduce these environmental loads during the potable water process. One strategy could be to reduce the use of Clarifloc by applying coadjuvants or alternative coagulant agents. Also, the preservation of the hydric source that supplies the treatment plant constitutes an important factor, since its deterioration confers unfavorable features to the water that is to be treated.
In conclusion, treatment processes and techniques, bioclimatic conditions and culturally driven consumption behavior vary from region to region. Furthermore, changes in treatment processes and techniques are likely to affect the environment during all stages of a plant's operation cycle.
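The attribution of impacts reported above (Clarifloc 82%, chlorine gas 13%, electricity 4%) follows the basic LCA characterization step: multiply each inventory flow per functional unit (1 m³ of treated water) by a characterization factor and compare the contributions. The inventory amounts and factors below are invented placeholders chosen only to mirror that ranking:

```python
# hypothetical inventory per functional unit (1 m^3 of treated water)
inventory = {"clarifloc_kg": 0.050, "chlorine_gas_kg": 0.014, "electricity_kwh": 0.010}

# hypothetical characterization factors (impact units per inventory unit)
factors = {"clarifloc_kg": 1.8, "chlorine_gas_kg": 1.0, "electricity_kwh": 0.4}

contributions = {k: inventory[k] * factors[k] for k in inventory}
total = sum(contributions.values())
shares = {k: v / total for k, v in contributions.items()}  # fraction of total impact
```

In a real study both the inventory and the factors come from the LCA database (here, Ecoinvent) rather than being set by hand.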

Keywords: climate change, environmental impact, life cycle assessment, treated water

Procedia PDF Downloads 212
7054 Epileptic Seizure Prediction by Exploiting Signal Transitions Phenomena

Authors: Mohammad Zavid Parvez, Manoranjan Paul

Abstract:

A seizure prediction method is proposed that extracts global features using phase correlation between adjacent epochs to detect relative changes, and local features using fluctuation/deviation within an epoch to determine fine changes in different EEG signals. A classifier and a regularization technique are applied for the reduction of false alarms and improvement of the overall prediction accuracy. The experiments show that the proposed method outperforms the state-of-the-art methods and provides high prediction accuracy (i.e., 97.70%) with a low false alarm rate using EEG signals from different brain locations from a benchmark data set.
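Phase correlation between adjacent epochs, the global feature above, normalizes the cross-power spectrum of two signals so that a pure relative shift shows up as a sharp peak at the shift offset. A minimal sketch on a synthetic signal (not EEG data):

```python
import numpy as np

def phase_correlation(a, b):
    # normalized cross-power spectrum; for a shifted relative to b,
    # the inverse FFT peaks at the relative (circular) shift
    A, B = np.fft.fft(a), np.fft.fft(b)
    R = A * np.conj(B)
    R = R / (np.abs(R) + 1e-12)
    return np.real(np.fft.ifft(R))

rng = np.random.default_rng(0)
epoch1 = rng.standard_normal(256)   # stand-in for one EEG epoch
epoch2 = np.roll(epoch1, 5)         # "adjacent" epoch: same content, shifted
shift = int(np.argmax(phase_correlation(epoch2, epoch1)))
```

When consecutive epochs stop resembling shifted copies of each other, the correlation peak flattens, which is the kind of relative change such features capture.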

Keywords: epilepsy, seizure, phase correlation, fluctuation, deviation

Procedia PDF Downloads 455
7053 Tool for Maxillary Sinus Quantification in Computed Tomography Exams

Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina

Abstract:

The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, to heat or humidify inspired air, for thermoregulation, to impart resonance to the voice, and for other purposes. Thus, the real function of the MS is still uncertain. Furthermore, the MS anatomy is complex and varies from person to person. Many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and if possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intraindividual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors’ institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) based on features such as pixel value, spatial distribution, shape and others.
The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, yielding the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. The linear regression showed a strong association and low dispersion between the variables. The Bland–Altman analyses showed no significant differences between the analyzed methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases.
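The seeded region-growing and morphological cleanup steps described above can be sketched as follows. This is only an illustrative Python/NumPy/SciPy analogue of the pipeline (the original tool is in Matlab®); the synthetic volume, intensity window, and voxel size are all assumptions, not values from the study.

```python
import numpy as np
from scipy import ndimage

def grow_region(volume, seed, low, high):
    """Seeded region growing: keep the connected component of the
    intensity window [low, high] that contains the seed voxel, then
    apply a morphological opening to drop small false-positive
    clusters, mirroring the post-processing step in the abstract."""
    mask = (volume >= low) & (volume <= high)
    if not mask[seed]:
        raise ValueError("seed voxel is outside the intensity window")
    labels, _ = ndimage.label(mask)
    region = labels == labels[seed]
    return ndimage.binary_opening(region, structure=np.ones((3, 3, 3)))

def volume_ml(region, voxel_mm3):
    """Voxel count times voxel size, converted from mm^3 to ml."""
    return region.sum() * voxel_mm3 / 1000.0

# Synthetic CT-like volume: an air-filled spherical 'sinus'
# (about -950 HU) inside a brighter tissue matrix.
vol = np.full((64, 64, 64), 60.0)
zz, yy, xx = np.ogrid[:64, :64, :64]
vol[(zz - 32) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2 <= 20 ** 2] = -950.0
seg = grow_region(vol, seed=(32, 32, 32), low=-1100.0, high=-800.0)
print(round(volume_ml(seg, voxel_mm3=0.5), 1))
```

In the real tool the seed comes from the SVM detector and the steps run per slice; here a single 3D pass suffices to show the idea.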

Keywords: maxillary sinus, support vector machine, region growing, volume quantification

Procedia PDF Downloads 492
7052 Hard and Soft Skills in Marketing Education: Using Serious Games to Engage Higher Order Processing

Authors: Ann Devitt, Mairead Brady, Markus Lamest, Stephen Gomez

Abstract:

This study set out to explore the use of an online collaborative serious game for student learning in a postgraduate introductory marketing module. The simulation game aimed to bridge the theory-practice divide in marketing by allowing students to apply theory in a safe, simulated marketplace. This study addresses the following research questions: Does an online marketing simulation game engage students' higher order cognitive skills? Does the required collaborative activity develop students' "soft" skills, such as communication and negotiation? What specific affordances of the online simulation promote learning? This qualitative case study took place in 2014 with 40 postgraduate students on a Business Masters Programme. The two-week intensive module combined lectures with collaborative activity on a marketing simulation game, MMX from Pearson. The game requires student teams to compete against other teams in a marketplace and design a marketing plan to maximize key performance indicators. The data for this study comprise essays written by students after the module reflecting on their learning. A thematic analysis of the essays was conducted using the following a priori theme sets: the six levels of the cognitive domain of Bloom's taxonomy; the five principles of Cooperative Learning; and affordances of simulation environments, including experiential learning, motivation and engagement, and goal orientation. Preliminary findings strongly suggest that the game helped students identify the value of theory in practice, in particular for future employment; enhanced their understanding of group dynamics and their role within them; and impacted motivation very strongly, both positively and negatively. In particular, the game mechanics of MMX, which hinge on the correct identification of a target consumer group, were identified as a key determinant of extrinsic and intrinsic motivation for learners.
The findings also suggest that situating the simulation game within a broader module requiring post-game reflection was valuable in identifying key learning of marketing concepts in both the positive and the negative experiences of the game.

Keywords: simulation, marketing, serious game, cooperative learning, bloom's taxonomy

Procedia PDF Downloads 539
7051 Towards Update a Road Map Solution: Use of Information Obtained by the Extraction of Road Network and Its Nodes from a Satellite Image

Authors: Z. Nougrara, J. Meunier

Abstract:

In this paper, we present a new approach for extracting roads, their network, and its nodes from satellite images representing regions of Algeria. Our approach builds on our previous research work and is founded on information theory and mathematical morphology. We therefore define objects as sets of pixels and study the shape of these objects and the relations that exist between them. The main interest of this study is to solve the problem of automatic mapping from satellite images, so that the geographical representation of the images is as near as possible to reality.
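Treating the road network as a set of pixels, the node-detection idea can be illustrated on a one-pixel-wide skeleton: junction nodes are road pixels with three or more road neighbours. The sketch below is a hedged Python illustration on a synthetic skeleton, not the authors' algorithm; the neighbourhood kernel and test grid are assumptions.

```python
import numpy as np
from scipy import ndimage

def road_nodes(skeleton):
    """Flag junction pixels of a one-pixel-wide binary road skeleton:
    road pixels with three or more 4-connected road neighbours."""
    kernel = np.array([[0, 1, 0],
                       [1, 0, 1],
                       [0, 1, 0]])
    neighbours = ndimage.convolve(skeleton.astype(int), kernel,
                                  mode="constant", cval=0)
    return skeleton & (neighbours >= 3)

# Synthetic skeleton: two crossing 'roads' meeting at (10, 10).
grid = np.zeros((21, 21), dtype=bool)
grid[10, :] = True   # east-west road
grid[:, 10] = True   # north-south road
nodes = road_nodes(grid)
print(np.argwhere(nodes))  # → [[10 10]]
```

On a real extraction the skeleton would first be obtained from the classified road mask by morphological thinning.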

Keywords: nodes, road network, satellite image, updating a road map

Procedia PDF Downloads 405
7050 Calibration and Validation of the Aquacrop Model for Simulating Growth and Yield of Rain-fed Sesame (Sesamum indicum L.) Under Different Soil Fertility Levels in the Semi-arid Areas of Tigray

Authors: Abadi Berhane, Walelign Worku, Berhanu Abrha, Gebre Hadgu

Abstract:

Sesame is an important oilseed crop in Ethiopia, the second most exported agricultural commodity next to coffee. However, soil fertility management is poor and there is no research-led farming system for the crop. The AquaCrop model was applied as a decision-support tool; it takes a semi-quantitative approach to simulating crop yield under different soil fertility levels. The objective of this experiment was to calibrate and validate the AquaCrop model for simulating the growth and yield of sesame under different nitrogen fertilizer levels and to test the performance of the model as a decision-support tool for improved sesame cultivation in the study area. The experiment was laid out as a randomized complete block design (RCBD) in a factorial arrangement in the 2016, 2017, and 2018 main cropping seasons. Four nitrogen fertilizer rates (0, 23, 46, and 69 kg/ha nitrogen) and three improved varieties (Setit-1, Setit-2, and Humera-1) were used. Growth, yield, and yield components of sesame were collected from each treatment. The coefficient of determination (R2), root mean square error (RMSE), normalized root mean square error (N-RMSE), model efficiency (E), and degree of agreement (D) were used to test the performance of the model. The results indicated that the AquaCrop model successfully simulated soil water content, with R2 varying from 0.92 to 0.98, RMSE from 6.5 to 13.9 mm, E from 0.78 to 0.94, and D from 0.95 to 0.99; the corresponding values for aboveground biomass (AB) varied from 0.92 to 0.98, 0.33 to 0.54 tons/ha, 0.74 to 0.93, and 0.9 to 0.98, respectively. The results on canopy cover also showed that the model acceptably simulated it, with R2 varying from 0.95 to 0.99 and an RMSE of 5.3 to 8.6%.
The AquaCrop model was appropriately calibrated to simulate soil water content, canopy cover, aboveground biomass, and yield of sesame; the results indicated that the model adequately simulated the growth and yield of sesame under the different nitrogen fertilizer levels. The AquaCrop model could thus be an important tool for improved soil fertility management and yield enhancement strategies of sesame, and might be applied as a decision-support tool in soil fertility management for sesame production.
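The goodness-of-fit statistics used above (RMSE, normalized RMSE, model efficiency E, and degree of agreement D) can be computed as follows. The formulas follow the standard Nash-Sutcliffe and Willmott definitions, which the abstract does not spell out, and the observed/simulated series are purely hypothetical.

```python
import numpy as np

def evaluate(obs, sim):
    """Model-evaluation statistics of the kind used to test AquaCrop:
    RMSE, normalized RMSE (%), Nash-Sutcliffe model efficiency (E)
    and Willmott's degree of agreement (D)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    rmse = np.sqrt(np.mean(err ** 2))
    nrmse = 100.0 * rmse / obs.mean()
    e = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    d = 1.0 - np.sum(err ** 2) / np.sum(
        (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return {"RMSE": rmse, "NRMSE_%": nrmse, "E": e, "D": d}

# Hypothetical observed vs simulated canopy cover (%) over a season.
observed = [5, 20, 45, 70, 85, 80, 60, 30]
simulated = [6, 18, 48, 67, 88, 78, 63, 28]
stats = evaluate(observed, simulated)
print({k: round(v, 3) for k, v in stats.items()})
```

Values of E and D near 1 and a small RMSE correspond to the acceptable simulation performance reported in the abstract.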

Keywords: aquacrop model, sesame, normalized water productivity, nitrogen fertilizer

Procedia PDF Downloads 55
7049 Combined Surface Tension and Natural Convection of Nanofluids in a Square Open Cavity

Authors: Habibis Saleh, Ishak Hashim

Abstract:

Combined surface tension and natural convection heat transfer in an open cavity is studied numerically in this article. The cavity is filled with a water-Cu nanofluid. The left wall is kept at a low temperature, the right wall at a high temperature, and the bottom and top walls are adiabatic. The top free surface is assumed to be flat and non-deformable. A finite difference method is applied to solve the dimensionless governing equations. It is found that adding the nanoparticles has an insignificant effect at about Ma_bf = 250.
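As an illustration of the finite-difference approach, the sketch below solves only the conduction limit of the cavity with these boundary conditions (cold left wall, hot right wall, adiabatic top and bottom) using a Jacobi iteration. The full solver additionally couples momentum, buoyancy, and the Marangoni condition at the free surface; grid size and iteration count here are arbitrary choices.

```python
import numpy as np

# Conduction-only limit of the cavity: dimensionless temperature with
# theta = 0 on the left wall, theta = 1 on the right wall, and zero
# normal gradient (adiabatic) on the bottom and top walls.
n = 41
theta = np.zeros((n, n))          # rows = y, columns = x
theta[:, -1] = 1.0                # hot right wall
for _ in range(5000):
    new = theta.copy()
    # Jacobi update of the interior points of the Laplace equation.
    new[1:-1, 1:-1] = 0.25 * (theta[2:, 1:-1] + theta[:-2, 1:-1]
                              + theta[1:-1, 2:] + theta[1:-1, :-2])
    new[0, 1:-1] = new[1, 1:-1]   # adiabatic bottom
    new[-1, 1:-1] = new[-2, 1:-1] # adiabatic top
    theta = new
print(round(float(theta[n // 2, n // 2]), 2))  # → 0.5
```

The converged field is linear in x, so the mid-cavity temperature is 0.5, a useful sanity check before adding the flow equations.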

Keywords: natural convection, marangoni convection, nanofluids, square open cavity

Procedia PDF Downloads 534
7048 The Impact of Mergers and Acquisitions on Financial Deepening in the Nigerian Banking Sector

Authors: Onyinyechi Joy Kingdom

Abstract:

Mergers and Acquisitions (M&A) have been proposed as a mechanism through which problems associated with inefficiency or poor performance in financial institutions could be addressed. The aim of this study is to examine the proposition that recapitalization of banks, which encouraged Mergers and Acquisitions in the Nigerian banking system, would strengthen the domestic banks and improve financial deepening and the confidence of depositors. Hence, this study examines the impact of the 2005 M&A in the Nigerian banking sector on financial deepening using a mixed method (quantitative and qualitative approach). The quantitative part of this study utilised an annual time series of a financial deepening indicator for the period 1997 to 2012. The qualitative part adopted semi-structured interviews to collect data from three merged banks and three stand-alone banks in order to explore, understand, and complement the quantitative results; a framework thematic analysis was employed to analyse the themes developed, using NVivo 11 software. In the quantitative approach, findings from the equality of means test (EMT) suggest that M&A have a significant impact on financial deepening. However, this method is not robust enough, given its weak validity: it does not control for other potential factors that may determine financial deepening. Thus, to control for other factors that may affect the level of financial deepening, a Multiple Regression Model (MRM) and an Interrupted Time Series Analysis (ITSA) were applied. The coefficient for the M&A dummy turned negative and insignificant in the MRM. In addition, the estimated post-intervention linear trend from the ITSA suggests that after the M&A the level of financial deepening decreased annually, although this decrease was statistically insignificant. Similarly, in the qualitative approach, the interview results supported the quantitative results from the ITSA and MRM.
The results suggest that interest rates should fall when the capital base is increased, in order to improve financial deepening. Hence, this study contributes to the existing literature by highlighting the importance of other factors that may affect financial deepening and the economy when policies to enhance bank performance are made. In addition, this study identifies policy instruments relevant to monetary authorities when formulating policies to strengthen the Nigerian banking sector and the economy.
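The interrupted time series analysis used above is, in its simplest segmented-regression form, an OLS fit with a pre-intervention trend, a level-change dummy, and a post-intervention slope change. The sketch below is a minimal Python illustration with hypothetical data (the study's actual series and specification are not reproduced here).

```python
import numpy as np

def itsa(y, t0):
    """Segmented regression for interrupted time series analysis:
    y_t = b0 + b1*t + b2*level_t + b3*post_t, where level_t is a step
    dummy at the intervention and post_t counts periods after it.
    b3 estimates the annual change in slope after the intervention."""
    y = np.asarray(y, float)
    t = np.arange(len(y), dtype=float)
    level = (t >= t0).astype(float)         # step change at t0
    post = np.where(t >= t0, t - t0, 0.0)   # slope change after t0
    X = np.column_stack([np.ones_like(t), t, level, post])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta                             # b0, b1, b2, b3

# Hypothetical financial-deepening series (e.g. a ratio in %),
# 1997-2012, with the intervention at the 2005 observation (index 8).
series = [12, 13, 13.5, 14, 15, 15.5, 16, 17,
          18, 17.5, 17, 16.8, 16.5, 16.2, 16, 15.8]
b0, b1, b2, b3 = itsa(series, t0=8)
print(round(b3, 2) < 0)   # post-intervention slope declines → True
```

A negative, insignificant b3 would correspond to the annual decrease in financial deepening reported above; significance testing would require the coefficient standard errors as well.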

Keywords: mergers and acquisitions, recapitalization, financial deepening, efficiency, financial crisis

Procedia PDF Downloads 379
7047 Single and Sequential Extraction for Potassium Fractionation and Nano-Clay Flocculation Structure

Authors: Chakkrit Poonpakdee, Jing-Hua Tzen, Ya-Zhen Huang, Yao-Tung Lin

Abstract:

Potassium (K) is a known macronutrient and essential element for plant growth. Single-leaching and modified sequential extraction schemes have been developed to estimate the relative phase associations of soil samples. The sequential extraction process is a step in analyzing the partitioning of metals affected by environmental conditions, but it is not a tool for estimating K bioavailability. The traditional single-leaching method, by contrast, has long been used to classify K speciation; it depends on K availability to plants and is used for potash fertilizer recommendation rates. Clay minerals in soil are a factor controlling soil fertility, and the change in their micro-structure under various environments (i.e., swelling or shrinking) is characterized using Transmission X-Ray Microscopy (TXM). The objectives of this study were to 1) compare the distribution of K speciation between the single-leaching and sequential extraction processes and 2) determine the clay particle flocculation structure before/after suspension with K+ using TXM. Four tropical soil samples were selected: farming without K fertilizer (10 years), farming with long-term K fertilizer application (10 years; 168-240 kg K2O ha-1 year-1), red soil (450-500 kg K2O ha-1 year-1), and forest soil. The results showed that the amounts of K speciation by the single-leaching method were, in decreasing order, mineral K, HNO3 K, non-exchangeable K, NH4OAc K, exchangeable K, and water-soluble K. The sequential extraction process indicated that most K speciation in soil was associated with the residual, organic matter, Fe or Mn oxide, and exchangeable fractions; the K fraction associated with carbonate was not detected in the tropical soil samples. The soils under long-term K fertilization and the red soil had higher exchangeable K than the soil farmed long-term without K fertilizer and the forest soil.
The results indicated that one way to increase available K (water-soluble K and exchangeable K) is to apply K fertilizer and organic fertilizer. Two-dimensional TXM images of clay particles suspended with K+ show that the clay minerals aggregate into closed-void cellular networks. The porous cellular structure of the soil aggregates had larger empty voids in 1 M KCl solution than in 0.025 M KCl and deionized water, respectively. TXM nanotomography is a new technique that can be useful in the field as a tool for better understanding clay mineral micro-structure.

Keywords: potassium, sequential extraction process, clay mineral, TXM

Procedia PDF Downloads 271
7046 Implementation of Efficiency and Energy Conservation Concept in Office Building as an Effort to Achieve Green Office Building Case Studies Office Building in Jakarta

Authors: Jarwa Prasetya Sih Handoko

Abstract:

The energy crisis is an issue raised for big cities in Indonesia in line with their rapidly increasing development. Various attempts have been made by the government to meet Indonesia's energy needs, but efforts by the public are also required to solve this problem. The green building concept, designing buildings to use energy efficiently, is one approach that can be applied. Jakarta is the capital and one of the major cities of Indonesia, with high economic growth. This leads to increased demand for office space, so the construction of office buildings in big cities like Jakarta is extensive. Office buildings require large energy consumption, and as buildings that potentially require huge amounts of energy, their design should consider energy use in order to help address Indonesia's energy crisis. The energy-efficiency concept addresses the use of energy in buildings so as to reduce the energy needed for building operations. Therefore, a study is needed that explores the application of the concepts of energy efficiency and conservation in office buildings in Jakarta. This study uses two case-study buildings, the Sequis Center Building and Sampoerna Strategic Square, both office buildings in Jakarta that have earned the Green Building Certificate of the Green Building Council Indonesia (GBCI). The study used literature review methods to address the issues raised, covering both office buildings and green building.
The result could serve as a reference for architects designing future office buildings, especially regarding the concept of energy use in buildings. From this study, it can be concluded that the concepts of energy efficiency and conservation in office building design can be applied through building orientation, openings, the use of shading, vegetation, building material selection, and efficient water use, thereby reducing the energy required to meet the needs of building users' activities. The concepts of energy efficiency and conservation in office buildings can thus be one of the efforts to realize the Green Office Building. The recommendation from this study is that office building designs should apply the concept of efficient energy utilization in order to meet the energy needs of office buildings and realize the Green Building.

Keywords: energy crisis, energy efficiency, energy conservation, green building, office building

Procedia PDF Downloads 285
7045 Effects of Surface Textures and Chemistries on Wettability

Authors: Dipti Raj, Himanshu Mishra

Abstract:

Wetting of a solid surface by a liquid is an extremely common yet subtle phenomenon in the natural and applied sciences. A clear understanding of both short- and long-term wetting behaviors of surfaces is essential for creating robust anti-biofouling coatings, non-wetting textiles, non-fogging mirrors, and preventive linings against dirt and icing. In this study, silica beads (diameter D ≈ 100 μm) functionalized using different silane reagents were employed to modify the wetting characteristics of smooth polydimethylsiloxane (PDMS) surfaces. The resulting composite surfaces were found to be super-hydrophobic, i.e. contact angle of water,
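For a textured surface in the Cassie-Baxter state (named in the keywords), the apparent contact angle on a composite solid/air interface follows cos θ* = f·cos θ − (1 − f), where f is the wetted solid fraction. The sketch below evaluates this relation; the intrinsic angle and solid fraction are illustrative assumptions, not values measured in the study.

```python
import math

def cassie_baxter_angle(theta_deg, solid_fraction):
    """Apparent contact angle (deg) in the Cassie-Baxter state:
    cos(theta*) = f*cos(theta) - (1 - f), where the trapped air
    under the droplet contributes cos(180 deg) = -1."""
    cos_star = (solid_fraction * math.cos(math.radians(theta_deg))
                - (1.0 - solid_fraction))
    return math.degrees(math.acos(cos_star))

# A moderately hydrophobic flat surface (110 deg, PDMS-like) textured
# so that only 10% of the droplet base touches solid:
print(round(cassie_baxter_angle(110.0, 0.10)))  # → 159
```

Reducing the solid fraction pushes the apparent angle toward super-hydrophobic values (> 150°), which is the qualitative effect the bead texturing exploits.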

Keywords: contact angle, Cassie-Baxter, PDMS, silica, texture, wetting

Procedia PDF Downloads 237
7044 Nurse-Led Codes: Practical Application in the Emergency Department during a Global Pandemic

Authors: F. DelGaudio, H. Gill

Abstract:

Resuscitation during cardiopulmonary arrest (CPA) is a dynamic, high-stress, high-acuity situation, which can easily lead to communication breakdown and errors. The care of these high-acuity patients has also been shown to increase the physiologic stress and task saturation of providers, which can negatively impact the care being provided. These difficulties are further complicated during a global pandemic and pose a significant safety risk to bedside providers. Nurse-led codes are a relatively new concept that may be a potential solution for alleviating some of these difficulties. An experienced nurse who has completed advanced cardiac life support (ACLS) and additional training assumed responsibility for directing the mechanics of the appropriate ACLS algorithm, in conjunction with a physician who also acted as a physician leader. The additional nurse-led code training included a multi-disciplinary in situ simulation of a CPA in a suspected COVID-19 patient. During the CPA, the nurse leader's responsibilities include ensuring adequate compression depth and rate, minimizing interruptions in chest compressions, timing rhythm/pulse checks, and appropriate medication administration. In addition, the nurse leader functions as a last-line safety check for appropriate personal protective equipment and for limiting staff exposure. The use of nurse-led codes for CPA has been shown to decrease cognitive overload and task saturation for the physician, as well as limiting the number of staff exposed to a potentially infectious patient. Real-world application has allowed physicians to perform and oversee high-risk procedures such as intubation, line placement, and point-of-care ultrasound without sacrificing the integrity of the resuscitation. Nurse-led codes have also given physicians the bandwidth to review pertinent medical history and advance directives, determine reversible causes, and have end-of-life conversations with family.
While there is a paucity of research on the effectiveness of nurse-led codes, there are many potentially significant benefits. In addition to their value during a pandemic, they may also be beneficial during complex circumstances such as extracorporeal cardiopulmonary resuscitation.

Keywords: cardiopulmonary arrest, COVID-19, nurse-led code, task saturation

Procedia PDF Downloads 131
7043 Characterization and Modelling of Aerosol Droplet in Absorption Columns

Authors: Hammad Majeed, Hanna Knuutila, Magne Hillestad, Hallvard F. Svendsen

Abstract:

Formation of aerosols can cause serious complications in industrial exhaust gas CO2 capture processes. SO3 present in the flue gas can cause aerosol formation in an absorption-based capture process. Small mist droplets and fog formed can normally not be removed in conventional demisting equipment because their submicron size allows the particles or droplets to follow the gas flow. As a consequence, aerosol-based emissions on the order of grams per Nm3 have been identified from post-combustion CO2 capture (PCCC) plants. In absorption processes, aerosols are generated by spontaneous condensation or desublimation in supersaturated gas phases. Undesired aerosol development may lead to amine emissions many times larger than would be encountered in a mist-free gas phase. It is thus of crucial importance to understand the formation and build-up of these aerosols in order to mitigate the problem. Rigorous modelling of aerosol dynamics leads to a system of partial differential equations. In order to understand the mechanics of a particle entering an absorber, an implementation of the model was created in Matlab. The model predicts the droplet size, the droplet internal variable profiles, and the mass transfer fluxes as functions of position in the absorber. The Matlab model is based on a weighted-residuals method for boundary value problems, the orthogonal collocation method. The model comprises a set of mass transfer equations for the transferring components and the essential diffusion-reaction equations to describe the droplet internal profiles for all relevant constituents. Also included is heat transfer across the interface and inside the droplet. This paper presents results describing the basic simulation tool for the characterization of aerosols formed in CO2 absorption columns and gives examples of how various entering droplets grow or shrink through an absorber and how their composition changes with respect to time.
Preliminary simulation results for an aerosol droplet's composition and temperature profiles are also given.
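The outermost part of such a model, droplet growth or shrinkage along the column, can be illustrated by a single diffusion-limited growth law, r·dr/dt = G·(S − 1), where S is the saturation ratio and G lumps vapour diffusivity and properties. This is only a toy outer step, not the paper's model (which resolves internal profiles by orthogonal collocation); G and S below are assumed values.

```python
import numpy as np
from scipy.integrate import solve_ivp

G = 1.0e-12   # m^2/s, assumed lumped growth coefficient
S = 1.005     # slight supersaturation, assumed

def drdt(t, r):
    # Diffusion-limited growth: r * dr/dt = G * (S - 1)
    return G * (S - 1.0) / r

# Integrate a 0.5 micron droplet for 10 s of residence time.
sol = solve_ivp(drdt, (0.0, 10.0), [0.5e-6], rtol=1e-8, atol=1e-12)
r_end = sol.y[0, -1]

# Closed form for comparison: r(t) = sqrt(r0^2 + 2*G*(S-1)*t)
r_exact = np.sqrt((0.5e-6) ** 2 + 2 * G * (S - 1.0) * 10.0)
print(abs(r_end - r_exact) / r_exact < 1e-4)  # → True
```

With S < 1 the same law shrinks the droplet, mirroring the grow-or-shrink behaviour of entering droplets described above.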

Keywords: absorption columns, aerosol formation, amine emissions, internal droplet profiles, monoethanolamine (MEA), post combustion CO2 capture, simulation

Procedia PDF Downloads 228
7042 Functions and Challenges of New County-Based Regional Plan in Taiwan

Authors: Yu-Hsin Tsai

Abstract:

A new, mandated county regional plan system has been initiated nationwide in Taiwan since 2010, with its role situated between the policy-led cross-county regional plan and the blueprint-led city plan. This new regional plan contains both urban and rural areas in one single plan, which provides a more complete planning territory, i.e., the city region within the county's jurisdiction, to be executed and managed effectively by the county government. However, the full picture of its functions and characteristics is not yet totally clear compared with other levels of plans; nor are the planning goals and issues that can be most appropriately dealt with at this spatial scale. In addition, the extent to which sustainability ideals and measures to cope with climate change are included is unclear. Based on the above issues, this study aims to clarify the roles of the county regional plan, to analyze the extent to which its measures address sustainability, climate change, and the forecasted population decline, and to identify the success factors and issues faced in the planning process. The methodology applied includes literature review, plan quality evaluation, and interviews with officials of the central and local governments and with the urban planners involved, for all 23 counties in Taiwan. The preliminary research results show, first, that growth-management-related policies have been widely implemented and are expected to be effective, including incorporating resource capacity to determine the maximum population for the city region as a whole, developing an overall vision of the urban growth boundary (UGB) for the whole city region, prioritizing infill development, and preferring building land within urbanized areas over rural areas to cope with urban growth. Secondly, planning-oriented zoning is adopted in urban areas, while demand-oriented planning permission is applied in rural areas with designated plans.
Then, public participation has evolved to the next level, overseeing all of the government's planning and review processes, owing to decreasing trust in government and the development of public forums on the internet, among other factors. Next, fertile agricultural land is preserved to maintain the food self-sufficiency goal for national security reasons. More adaptation-based than mitigation-based methods have been applied to cope with global climate change. Finally, better land use and transportation planning is promoted, in terms of avoiding the development of rail transit stations and corridors in rural areas. Even though many promising, prompt measures have been adopted, challenges remain. First, overall urban density, which likely affects the success of the UGB and the use of rural agricultural land, has not been incorporated, possibly due to implementation difficulties. Second, land-use-related measures for mitigating climate change seem less clear and hence less employed. Smart decline has not drawn enough attention as a way to cope with the population decrease predicted for the next decade. Then, some reluctance by county governments to implement the county regional plan can be vaguely observed, possibly because limits have been set on further development of agricultural land and sensitive areas. Finally, resolving the issue of existing illegal factories on agricultural land remains the most challenging dilemma.

Keywords: city region plan, sustainability, global climate change, growth management

Procedia PDF Downloads 333
7041 Lipase-Mediated Formation of Peroxyoctanoic Acid Used in Catalytic Epoxidation of α-Pinene

Authors: N. Wijayati, Kusoro Siadi, Hanny Wijaya, Maggy Thenawijjaja Suhartono

Abstract:

This work describes the lipase-mediated synthesis of α-pinene oxide at ambient temperature. An immobilized lipase from Pseudomonas aeruginosa is used to generate peroxyoctanoic acid directly from octanoic acid and hydrogen peroxide. The peroxy acid formed is then applied for in situ oxidation of α-pinene. High conversion of α-pinene to α-pinene oxide (approximately 78%) was achieved when using 0.1 g of lipase enzyme, 6 mmol H2O2, and 5 mmol octanoic acid. Various parameters affecting the conversion of α-pinene to α-pinene oxide were studied.

Keywords: α-pinene, P. aeruginosa, octanoic acid

Procedia PDF Downloads 260