Search results for: automatic selective door operations
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3362

3092 Segmentation of Liver Using Random Forest Classifier

Authors: Gajendra Kumar Mourya, Dinesh Bhatia, Akash Handique, Sunita Warjri, Syed Achaab Amir

Abstract:

Nowadays, medical imaging has become an integral part of modern healthcare. Abdominal CT images are an invaluable means for abdominal organ investigation and have been widely studied in recent years. Diagnosis of liver pathologies is one of the major areas of current interest in the field of medical image processing and is still an open problem. To study and diagnose the liver in depth, segmentation is performed to identify which part of the liver is most affected. Manual segmentation of the liver in CT images is time-consuming and suffers from inter- and intra-observer differences. However, automatic or semi-automatic computer-aided segmentation of the liver is a challenging task due to inter-patient variability in liver shape and size. In this paper, we present a technique for automatically segmenting the liver from CT images using a Random Forest classifier. Random forests, or random decision forests, are an ensemble learning method for classification that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes of the individual trees. After comparison with various other techniques, it was found that the Random Forest classifier provides better segmentation results with respect to accuracy and speed. We validated our results using various techniques, and the method shows above 89% accuracy in all cases.
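
A minimal sketch of the kind of voxel-wise random-forest segmentation described above, written in Python with scikit-learn. This is an illustration only, not the authors' code: the feature set (raw intensity plus a local mean) and the training arrays `ct_train`, `liver_mask_train`, and `ct_test` are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.ensemble import RandomForestClassifier

def voxel_features(ct_volume):
    """Toy per-voxel features: raw intensity plus a local-mean context feature."""
    intensity = ct_volume.astype(np.float32)
    local_mean = uniform_filter(intensity, size=5)
    return np.stack([intensity.ravel(), local_mean.ravel()], axis=1)

def train_and_segment(ct_train, liver_mask_train, ct_test):
    """Train on one annotated scan (assumed available) and segment a new scan."""
    clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
    clf.fit(voxel_features(ct_train), liver_mask_train.ravel())
    prediction = clf.predict(voxel_features(ct_test))
    return prediction.reshape(ct_test.shape)  # binary liver mask for the test scan
```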

Keywords: CT images, image validation, random forest, segmentation

Procedia PDF Downloads 283
3091 Design and Development of Automatic Onion Harvester

Authors: P. Revathi, T. Mrunalini, K. Padma Priya, P. Ramya, R. Saranya

Abstract:

During the tough times of COVID, people who were hospitalized found it difficult to always convey what they wanted or needed to the attendee. Sometimes the attendees might also not be there. In that case, the patients can use simple hand gestures to control electrical appliances (for example, switching a zero-watt bulb) and three other gestures for voice-note intimation. In this AI-based hand-recognition project, a NodeMCU is used for the control action of the relay; it is connected to Firebase for storing the value in the cloud and is interfaced with the Python code via a Raspberry Pi. For three hand gestures, a voice clip is added for intimation to the attendee. This is done with the help of Google’s text-to-speech and the inbuilt audio file option on the Raspberry Pi 4. All five gestures are detected when shown via a webcam placed for gesture detection. A personal computer is used for displaying the gestures and for running the code in the Raspberry Pi Imager.

Keywords: onion harvesting, automatic plucking, camera, raspberry pi

Procedia PDF Downloads 170
3090 Short Answer Grading Using Multi-Context Features

Authors: S. Sharan Sundar, Nithish B. Moudhgalya, Nidhi Bhandari, Vineeth Vijayaraghavan

Abstract:

Automatic short answer grading is one of the prime applications of artificial intelligence in education. Several approaches involving the utilization of selective handcrafted features, graphical matching techniques, concept identification and mapping, complex deep frameworks, sentence embeddings, etc., have been explored over the years. However, keeping in mind the real-world application of the task, these solutions present a slight overhead in terms of computation and resources in achieving high performance. In this work, a simple and effective solution is proposed, making use of elemental features based on statistical and linguistic properties and word-based similarity measures in conjunction with tree-based classifiers and regressors. The results for classification tasks show improvements ranging from 1% to 30%, while the regression task shows a stark improvement of 35%. The authors attribute these improvements to the addition of multiple similarity scores, which provide an ensemble of scoring criteria to the models. The authors also believe the work demonstrates that classical natural language processing techniques and simple machine learning models can be used to achieve high results for short answer grading.
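
As a hedged illustration of the approach described (not the authors' exact feature set or model), the sketch below computes two elemental features, a word-overlap similarity and a length ratio, and fits a tree-based regressor to human-assigned grades; the data variables are placeholders.

```python
from sklearn.ensemble import GradientBoostingRegressor

def answer_features(student_answer: str, reference_answer: str):
    """Elemental features: a word-based similarity score and simple statistics."""
    s = set(student_answer.lower().split())
    r = set(reference_answer.lower().split())
    overlap = len(s & r) / max(len(r), 1)        # word-overlap similarity
    length_ratio = len(s) / max(len(r), 1)       # statistical property
    return [overlap, length_ratio, len(s)]

def train_grader(answer_pairs, human_scores):
    """answer_pairs: list of (student, reference) strings; human_scores: grades."""
    X = [answer_features(s, r) for s, r in answer_pairs]
    model = GradientBoostingRegressor(random_state=0)
    model.fit(X, human_scores)
    return model
```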

Keywords: artificial intelligence, intelligent systems, natural language processing, text mining

Procedia PDF Downloads 113
3089 Diagnosis and Analysis of Automated Liver and Tumor Segmentation on CT

Authors: R. R. Ramsheeja, R. Sreeraj

Abstract:

A wide range of medical imaging modalities is available nowadays for viewing the internal structures of the human body, such as the liver, brain, and kidney. Computed tomography (CT) is one of the most significant of these modalities. In this paper, CT liver images are used to study automatic computer-aided techniques for calculating the volume of a liver tumor. A segmentation method for detecting the tumor from the CT scan is proposed. A Gaussian filter is used for denoising the liver image, and an adaptive thresholding algorithm is used for segmentation. A multiple region of interest (ROI) based method helps to characterize the different features and has a significant impact on classification performance. Due to the characteristics of liver tumor lesions, inherent difficulties appear in feature selection. For better performance, a novel system is introduced in which multiple ROI-based feature selection and classification are performed. Obtaining relevant features for the Support Vector Machine (SVM) classifier is important for better generalization performance. The proposed system helps to improve classification performance, owing in part to a significant reduction in the number of features used. The diagnosis of liver cancer from computed tomography images is very difficult in nature, and early detection of liver tumors is very helpful for saving human life.
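
The pipeline named in the abstract (Gaussian denoising, adaptive thresholding, ROI features, SVM classification) can be sketched as follows. This is an assumption-laden illustration, not the authors' implementation: the ROI feature definitions, filter parameters, and the training data (`slices`, `labels`) are placeholders.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.filters import threshold_local
from sklearn.svm import SVC

def segment_slice(ct_slice):
    """Gaussian denoising followed by adaptive (local) thresholding."""
    denoised = gaussian_filter(ct_slice.astype(float), sigma=2.0)
    local_threshold = threshold_local(denoised, block_size=51)
    return denoised > local_threshold                  # binary segmentation mask

def roi_features(ct_slice, mask):
    """Simple descriptors of the segmented region of interest."""
    roi = ct_slice[mask] if mask.any() else ct_slice.ravel()
    return [roi.mean(), roi.std(), float(mask.sum())]

def train_classifier(slices, labels):
    """slices: list of 2D CT slices; labels: 1 = tumor present, 0 = absent."""
    X = [roi_features(s, segment_slice(s)) for s in slices]
    return SVC(kernel="rbf").fit(X, labels)
```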

Keywords: computed tomography (CT), multiple region of interest (ROI), feature values, segmentation, SVM classification

Procedia PDF Downloads 486
3088 Surface Integrity Improvement for Selective Laser Melting (SLM) Additive Manufacturing of C300 Parts Using Ball Burnishing

Authors: Adrian Travieso Disotuar, J. Antonio Travieso Rodriguez, Ramon Jerez Mesa, Montserrat Vilaseca

Abstract:

The effect of non-vibration-assisted and vibration-assisted ball burnishing on both the surface and mechanical properties of C300 obtained by Selective Laser Melting additive manufacturing technology is studied in this paper. Different vibration amplitudes, preloads, and burnishing strategies were tested. A topographical analysis was performed to determine the surface roughness of the different conditions. In addition, micro-tensile tests were carried out in situ in a scanning electron microscope to elucidate the post-treatment effects on damage mechanisms. Experiments show that vibration-assisted ball burnishing significantly enhances mechanical properties compared to the non-vibration-assisted method. Moreover, the surface roughness was significantly improved with respect to the reference surface.

Keywords: additive manufacturing, ball burnishing, mechanical properties, metals, surface roughness

Procedia PDF Downloads 41
3087 Parkinson’s Disease Hand-Eye Coordination and Dexterity Evaluation System

Authors: Wann-Yun Shieh, Chin-Man Wang, Ya-Cheng Shieh

Abstract:

This study aims to develop an objective scoring system to evaluate hand-eye coordination and hand dexterity for Parkinson’s disease. The system contains three boards, each implemented with sensors to sense a user’s finger operations. The operations include the peg test, the block test, and the blind block test. A user has to use vision, hearing, and tactile abilities to finish these operations, and the board records the results automatically. These results can help physicians evaluate a user’s reaction, coordination, and dexterity functions. The results are collected in a cloud database for further analysis and statistics. A researcher can use this system to obtain systematic, graphical reports for an individual or a group of users. In particular, a deep learning model is developed to learn the features of the data from different users. This model will help physicians assess Parkinson’s disease symptoms with a more intelligent algorithm.

Keywords: deep learning, hand-eye coordination, reaction, hand dexterity

Procedia PDF Downloads 38
3086 A Paper Based Sensor for Mercury Ion Detection

Authors: Emine G. Cansu Ergun

Abstract:

Conjugated-system-based sensors for the selective detection of metal ions have been attracting attention during the last two decades. Fluorescent sensors are promising candidates for ion detection due to their high selectivity towards metal ions and rapid response times. Detection of mercury in the environment is important since mercury is a toxic element for humans. Beyond the maximum allowable limit, mercury may cause serious problems in human health by spreading into the atmosphere, water, and the food chain. In this study, a quinoxaline and 3,4-ethylenedioxythiophene based donor-acceptor-donor type conjugated molecule was used as a fluorescent sensor for detecting the mercury ion in aqueous medium. Among various other cations, the existence of mercury resulted in full quenching of the fluorescence signal. Then, a paper-based sensor was constructed and used for mercury detection. As a result, it is concluded that the proposed sensor is a good candidate for selective mercury detection in aqueous media, both in solution and in paper-based form.

Keywords: conjugated molecules, fluorescence quenching, metal ion detection, sensors

Procedia PDF Downloads 131
3085 Bioproduction of L(+)-Lactic Acid and Purification by Ion Exchange Mechanism

Authors: Zelal Polat, Şebnem Harsa, Semra Ülkü

Abstract:

Lactic acid exists in nature optically in two forms, L(+)- and D(-)-lactic acid, and has been used in the food, leather, textile, pharmaceutical, and cosmetic industries. Moreover, L(+)-lactic acid constitutes the raw material for the production of poly-L-lactic acid, which is used in biomedical applications. Microbially produced lactic acid was aimed to be recovered from the fermentation media efficiently and economically. Among the various downstream operations, ion exchange chromatography is highly selective and yields low-cost product recovery within a short period of time. In this project, Lactobacillus casei NRRL B-441 was used for the production of L(+)-lactic acid from whey by fermentation at pH 5.5 and 37°C, which took 12 hours. The product concentration was 50 g/l with 100% L(+)-lactic acid content. Next, a suitable resin was selected for its high sorption capacity and rapid equilibrium behavior. Dowex Marathon WBA, a weakly basic anion exchanger in OH form, reached equilibrium in 15 minutes. The batch adsorption experiments were done at approximately pH 7.0 and 30°C, and sampling was continued for 20 hours. Furthermore, the effect of temperature and pH was investigated, and their influence was found to be insignificant. All the adsorption/desorption experiments were applied both to model lactic acid and to biomass-free fermentation broth. The ion exchange equilibria of lactic acid and of L(+)-lactic acid in fermentation broth on Dowex Marathon WBA were explained by the Langmuir isotherm. The maximum exchange capacity (qm) was 0.25 g La/g wet resin for model lactic acid and 0.04 g La/g wet resin for fermentation broth. The equilibrium loading and exchange efficiency of L(+)-lactic acid in fermentation broth were reduced as a result of competition by other ionic species; the competing ions inhibit the binding of L(+)-lactic acid to the free sites of the ion exchanger. Moreover, column operations were applied to recover the adsorbed lactic acid from the ion exchanger. 2.0 M HCl was a suitable eluting agent to recover the bound L(+)-lactic acid at a flow rate of 1 ml/min at ambient temperature, and about 95% of the bound L(+)-lactic acid was recovered from Dowex Marathon WBA. The aim of this project was to investigate the purification of L(+)-lactic acid from fermentation broth by the ion exchange method. Additional goals were to investigate the end-product purity and to obtain new data on the adsorption/desorption behaviour of lactic acid and the applicability of the system for industrial use.
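
For reference, the Langmuir isotherm referred to above is normally written in the form below. This is the standard relation shown for clarity; symbol definitions are the usual ones, and only the reported qm values come from the abstract.

```latex
% Langmuir isotherm in the form commonly fitted to ion-exchange equilibrium data
% q_e : equilibrium loading on the resin (g lactic acid per g wet resin)
% q_m : maximum exchange capacity (0.25 g/g for model lactic acid, 0.04 g/g for broth)
% K_L : Langmuir equilibrium constant
% C_e : equilibrium lactic acid concentration in solution
\[
  q_e = \frac{q_m K_L C_e}{1 + K_L C_e}
\]
```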

Keywords: fermentation, ion exchange, lactic acid, purification, whey

Procedia PDF Downloads 480
3084 Concentrated Animal Feeding Operations and Planning in the United States: Evidences from North Carolina

Authors: Asmaa Benbaba

Abstract:

This paper aims to reconsider the relationships between concentrated animal feeding operations (CAFOs) and planning. It stresses the necessity of a methodological revolution in order to increase the chances for dialogue between different actors and various planning agencies and to create possibilities for managing conflicts. The explored case of North Carolina shows limitations in environmental agencies’ actions and methods. It also calls for a more integrated approach among agencies, including local agencies.

Keywords: CAFOs, North Carolina, planning, United States

Procedia PDF Downloads 377
3083 A New Approach to Interval Matrices and Applications

Authors: Obaid Algahtani

Abstract:

An interval may be defined as a convex combination as follows: I = [a,b] = {x_α = (1-α)a + αb : α ∈ [0,1]}. Consequently, we may adopt interval operations by applying the scalar operation point-wise to the corresponding interval points: I∙J = {x_α∙y_α : α ∈ [0,1], x_α ∈ I, y_α ∈ J}, with the usual restriction 0 ∉ J if ∙ = ÷. These operations are associative: I+(J+K) = (I+J)+K, I*(J*K) = (I*J)*K. These two properties, which are missing in the usual interval operations, will enable the extension of the usual linear system concepts to the interval setting in a seamless manner. The arithmetic introduced here avoids such vague terms as ”interval extension”, ”inclusion function”, and determinants, which we encounter in the engineering literature dealing with interval linear systems. On the other hand, these definitions were motivated by our attempt to arrive at a definition of interval random variables and to investigate the corresponding statistical properties. We feel that they are the natural ones for handling interval systems, and they will enable the extension of many results from usual state space models to interval state space models. The interval state space model considered here is of the form X_(t+1) = A X_t + W_t, Y_t = H X_t + V_t, t ≥ 0, where A ∈ IR^(k×k) and H ∈ IR^(p×k) are interval matrices and W_t ∈ IR^k, V_t ∈ IR^p are zero-mean Gaussian white-noise interval processes. This view is supported by the numerical results we obtained in simulation examples.
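
A minimal Python sketch of this point-wise construction, written as an assumption about the intended meaning rather than the author's own formulation: an interval is kept as the function α ↦ (1-α)a + αb, operations pair points with the same α, and associativity then holds by construction.

```python
import numpy as np

class IntervalExpr:
    """An interval-valued expression stored as a function alpha -> value, alpha in [0, 1]."""
    def __init__(self, f):
        self.f = f

    @staticmethod
    def interval(a, b):
        # x_alpha = (1 - alpha) * a + alpha * b
        return IntervalExpr(lambda alpha: (1 - alpha) * a + alpha * b)

    def __add__(self, other):
        return IntervalExpr(lambda alpha: self.f(alpha) + other.f(alpha))

    def __mul__(self, other):
        return IntervalExpr(lambda alpha: self.f(alpha) * other.f(alpha))

    def hull(self, n=1001):
        """Enclosing ordinary interval, obtained by sampling alpha."""
        values = self.f(np.linspace(0.0, 1.0, n))
        return float(values.min()), float(values.max())

I = IntervalExpr.interval(1.0, 2.0)
J = IntervalExpr.interval(-1.0, 3.0)
K = IntervalExpr.interval(0.0, 4.0)

alphas = np.linspace(0.0, 1.0, 5)
# Point-wise pairing makes the two groupings agree as functions of alpha:
print(np.allclose(((I * J) * K).f(alphas), (I * (J * K)).f(alphas)))  # True
print(((I + J) + K).hull(), (I + (J + K)).hull())
```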

Keywords: interval analysis, interval matrices, state space model, Kalman Filter

Procedia PDF Downloads 399
3082 Optimized Control of Roll Stability of Missile Using Genetic Algorithm

Authors: Pham Van Hung, Nguyen Trong Hieu, Le Quoc Dinh, Nguyen Kiem Chien, Le Dinh Hieu

Abstract:

The article focuses on the study of automatic flight control on missiles during operation. The quality standards and characteristics of missile operations are very strict, requiring high stability and accurate response to commands within a relatively wide range of work. The study analyzes the linear transfer function model of the Missile Roll channel to facilitate the development of control systems. A two-loop control structure for the Missile Roll channel is proposed, with the inner loop controlling the Missile Roll rate and the outer loop controlling the Missile Roll angle. To determine the optimal control parameters, a genetic algorithm is applied. The study uses MATLAB simulation software to implement the genetic algorithm and evaluate the quality of the closed-loop system. The results show that the system achieves better quality than the original structure and is simple, reliable, and ready for implementation in practical experiments.
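
As a rough illustration of how a genetic algorithm can tune the two loop gains, the sketch below is an assumption-based analogue (the abstract reports a MATLAB implementation): it uses a toy first-order roll-rate plant, invented gain ranges, and generic GA settings, and is not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def roll_cost(gains, tau=0.2, dt=0.01, t_end=5.0, phi_ref=1.0):
    """Integrated absolute roll-angle error for a toy first-order roll-rate plant."""
    k_outer, k_inner = gains
    phi, p, cost = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        p_cmd = k_outer * (phi_ref - phi)      # outer loop: roll angle
        u = k_inner * (p_cmd - p)              # inner loop: roll rate
        p += dt * (u - p) / tau                # assumed roll-rate dynamics
        phi += dt * p
        cost += dt * abs(phi_ref - phi)
    return cost

def genetic_tuning(pop_size=30, generations=40, bounds=(0.1, 20.0)):
    pop = rng.uniform(*bounds, size=(pop_size, 2))
    for _ in range(generations):
        fitness = np.array([roll_cost(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]     # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(2) < 0.5, a, b)          # uniform crossover
            child += rng.normal(0.0, 0.5, size=2)                # mutation
            children.append(np.clip(child, *bounds))
        pop = np.vstack([parents, children])
    return pop[np.argmin([roll_cost(ind) for ind in pop])]

print("tuned (outer, inner) gains:", genetic_tuning())
```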

Keywords: genetic algorithm, roll channel, two-loop control structure, missile

Procedia PDF Downloads 59
3081 Survey Research Assessment for Renewable Energy Integration into the Mining Industry

Authors: Kateryna Zharan, Jan C. Bongaerts

Abstract:

Mining operations are energy intensive, and the share of energy costs in total costs is often quoted in the range of 40%. Saving on energy costs is, therefore, a key concern of any mine operator. With the improving reliability and security of renewable energy (RE) sources, and with requirements to reduce carbon dioxide emissions, perspectives for using RE in mining operations emerge. These aspects are stimulating mining companies to search for ways to substitute fossil energy with RE. The main purpose of this study is therefore to present a survey-based assessment of the key issues related to the integration of RE into mining activities, based on the opinions of mining and renewable energy experts. The survey research was developed as follows: first, the mining and renewable energy experts were chosen based on specific criteria; second, they were offered a questionnaire to gather their knowledge and opinions on incentives for mining operators to turn to RE, barriers and challenges to be expected, environmental effects, appropriate business models, and the overall impact of RE on mining operations. The outcomes of the survey allow for the identification of factors which favor and disfavor decision-making on the use of RE in mining operations. The study concludes with a set of recommendations for further work. One of them relates to a deeper analysis of the benefits for mining operators of using RE, and another suggests that appropriate business models considering economic and environmental issues need to be studied and developed. The results of the paper will be used for developing a hybrid optimized model which might be adopted at mines according to their operating processes as well as economic and environmental perspectives.

Keywords: carbon dioxide emissions, mining industry, photovoltaic, renewable energy, survey research, wind generation

Procedia PDF Downloads 335
3080 A Study on Design for Parallel Test Based on Embedded System

Authors: Zheng Sun, Weiwei Cui, Xiaodong Ma, Hongxin Jin, Dongpao Hong, Jinsong Yang, Jingyi Sun

Abstract:

With the improvement of the performance and complexity of modern equipment, automatic test systems (ATS) have become widely used for condition monitoring and fault diagnosis. However, a conventional ATS mainly works in a serial mode and lacks the ability to test several pieces of equipment at the same time. That leads to low test efficiency and ATS redundancy. Especially when a large amount of equipment is under test, a conventional ATS cannot meet the requirement of efficient testing. To reduce the support resources and increase test efficiency, we propose a design method for parallel testing based on an embedded system in this paper. Firstly, we put forward the general framework of the parallel test system, which contains a central management system (CMS) and several distributed test subsystems (DTS). Then we give a detailed design of the system. For the hardware, we use an embedded architecture to design the DTS. For the software, we use a test program set to improve test adaptation. By deploying the parallel test system, the time to test five devices is now equal to the time previously needed to test one device. Compared with a conventional test system, the proposed test system reduces the size and improves testing efficiency. This is of great significance for equipment to be put into operation swiftly. Finally, we take an industrial control system as an example to verify the effectiveness of the proposed method. The result shows that the method is reasonable, and the efficiency is improved by up to 500%.

Keywords: parallel test, embedded system, automatic test system (ATS), central management system (CMS), distributed test subsystems (DTS)

Procedia PDF Downloads 268
3079 Factor Analysis of Self-Efficacy among Trainees in the National Service for the Healthy Lifestyle Program

Authors: Nuzsep Almigo, Md Amin Md Taff, Yusop Ahmad, Norkhalid Salimin, Gunathevan Elumalai

Abstract:

This research aimed to determine the level of self-efficacy in obese trainees before and after the Healthy Lifestyle Program. Self-efficacy is defined as the feeling, belief, and perception of one's ability to cope with a particular situation, which influences the way individuals cope with that situation. The research instrument used was a self-efficacy questionnaire consisting of four main factors: (i) cognitive (the ability to hold positive and realistic attitudes toward one's potential to perform duties, within restrictions or social expectations), (ii) affective (the ability to manage mental state, feeling, and mood), (iii) motivation (determination and the level of ability to achieve a purpose or goal), and (iv) selective (the ability to choose which social conditions to confront and to adapt to situations). The study sample consisted of 118 trainees from the Healthy Lifestyle Program. The analysis showed a significant difference in self-efficacy before and after the Healthy Lifestyle Program (p = 0.00), indicated by increased self-efficacy over the course of the program.

Keywords: self efficacy, self-confidence, affective, motivation, selective

Procedia PDF Downloads 399
3078 Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore

Authors: Ronal Muresano, Andrea Pagano

Abstract:

Nowadays, mathematical/statistical applications are developed with more complexity and accuracy. However, this precision and complexity mean that applications need more computational power in order to be executed faster. In this sense, multicore environments are playing an important role in improving and optimizing the execution time of these applications. These environments allow the inclusion of more parallelism inside the node. However, taking advantage of this parallelism is not an easy task, because we have to deal with problems such as core communication, data locality, memory sizes (cache and RAM), synchronization, data dependencies in the model, etc. These issues become more important when we wish to improve the application’s performance and scalability. Hence, this paper describes an optimization method developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool of the European Commission, which is based on analyzing the application's weaknesses in order to exploit the advantages of the multicore. All these improvements are done in an automatic and transparent manner with the aim of improving the performance metrics of our tool. Finally, experimental evaluations show the effectiveness of our new optimized version, in which we have achieved a considerable improvement in the execution time: the time has been reduced by around 96% for the best case tested, between the original serial version and the automatic parallel version.

Keywords: algorithm optimization, bank failures, OpenMP, parallel techniques, statistical tool

Procedia PDF Downloads 344
3077 Distribution of Traffic Volume at Fuel Station during Peak Hour Period on Arterial Road

Authors: Surachai Ampawasuvan, Supornchai Utainarumol

Abstract:

Most fuel station customers who drive on a major arterial road want to use the stations to refuel their vehicles during their journey to their destinations. According to surveys of the traffic volume of vehicles using fuel stations, carried out with video cameras, automatic counting tools, and questionnaires, it was found that most users prefer to use fuel stations on holidays rather than on working days, and in the morning rather than in the evening. When comparing the distribution patterns of the traffic volume of vehicles using fuel stations obtained by video cameras and by automatic counting tools, there is no significant difference. However, when comparing the peak hour rate of 13 to 14 percent obtained from questionnaires with the results obtained using the methods of the Institute of Transportation Engineers (ITE), the values are similar, whereas the rates from the video camera and automatic traffic counting surveys, at 6 to 7 percent, are about half. This study therefore suggests that, in order to forecast trip generation of vehicles using fuel stations on major arterial roads, which are mostly characterized by through traffic, half of the peak hour rate should be used; this would make the forecast of trip generation more precise, more accurate, and more compatible with the surrounding environment.

Keywords: peak rate, trips generation, fuel station, arterial road

Procedia PDF Downloads 373
3076 Hounsfield-Based Automatic Evaluation of Volumetric Breast Density on Radiotherapy CT-Scans

Authors: E. M. D. Akuoko, Eliana Vasquez Osorio, Marcel Van Herk, Marianne Aznar

Abstract:

Radiotherapy is an integral part of treatment for many patients with breast cancer. However, side effects can occur, e.g., fibrosis or erythema. If patients at higher risk of radiation-induced side effects could be identified before treatment, they could be given more individual information about the risks and benefits of radiotherapy. We hypothesize that breast density is correlated with the risk of side effects and present a novel method for automatic evaluation based on radiotherapy planning CT scans. Methods: 799 supine CT scans of breast radiotherapy patients were available from the REQUITE dataset. The methodology was first established in a subset of 114 patients (cohort 1) before being applied to the whole dataset (cohort 2). All patients were scanned in the supine position, with arms up, and the treated (ipsilateral) breast was identified. Manual expert contours were available for 96 patients in cohort 1, for both the ipsilateral and contralateral breasts. Breast tissue was segmented using atlas-based automatic contouring software, ADMIRE® v3.4 (Elekta AB, Sweden). Once validated, the automatic segmentation method was applied to cohort 2. Breast density was then investigated by thresholding voxels within the contours, using an Otsu threshold and pixel intensity ranges based on Hounsfield units (-200 to -100 for fatty tissue, and -99 to +100 for fibro-glandular tissue). Volumetric breast density (VBD) was defined as the volume of fibro-glandular tissue / (volume of fibro-glandular tissue + volume of fatty tissue). A sensitivity analysis was performed to verify whether the calculated VBD was affected by the choice of breast contour. In addition, we investigated the correlation between VBD and patient age and breast size. VBD values were compared between ipsilateral and contralateral breast contours. Results: Estimated VBD values were 0.40 (range 0.17-0.91) in cohort 1, and 0.43 (0.096-0.99) in cohort 2. We observed ipsilateral breasts to be denser than contralateral breasts. Breast density was negatively associated with breast volume (Spearman: R=-0.5, p-value < 2.2e-16) and age (Spearman: R=-0.24, p-value = 4.6e-10). Conclusion: VBD estimates could be obtained automatically on a large CT dataset. Patients’ age or breast volume may not be the only variables that explain breast density. Future work will focus on assessing the usefulness of VBD as a predictive variable for radiation-induced side effects.
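
The VBD definition above translates directly into a few lines of code. The sketch below is illustrative only (array names and mask handling are assumptions), using the Hounsfield ranges quoted in the abstract:

```python
import numpy as np

def volumetric_breast_density(hu_volume: np.ndarray, breast_mask: np.ndarray) -> float:
    """VBD = fibro-glandular volume / (fibro-glandular volume + fatty volume)."""
    hu = hu_volume[breast_mask]                          # voxels inside the breast contour
    fatty = np.count_nonzero((hu >= -200) & (hu <= -100))
    fibroglandular = np.count_nonzero((hu >= -99) & (hu <= 100))
    total = fatty + fibroglandular
    return fibroglandular / total if total else float("nan")
```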

Keywords: breast cancer, automatic image segmentation, radiotherapy, big data, breast density, medical imaging

Procedia PDF Downloads 108
3075 Automatic Registration of Rail Profile Based on Local Maximum Curvature Entropy

Authors: Hao Wang, Shengchun Wang, Weidong Wang

Abstract:

To address the influence of train vibration and environmental noise on the measurement of track wear, we propose a method for the automatic extraction of the circular arc on the inner or outer side of the rail waist and achieve high-precision registration of the rail profile. Firstly, a polynomial fitting method based on a truncated residual histogram is proposed to find the optimal fitting curve of the profile and reduce the influence of noise on profile curve fitting. Then, based on the curvature distribution characteristics of the fitted curve, an interval search algorithm based on the maximum curvature entropy of a dynamic window is proposed to realize the automatic segmentation of the small circular arc. Finally, we fit two circle centers as matching reference points based on the small circular arcs on both sides and realize the alignment of the measured profile to the standard designed profile. The static experimental results show that the mean and standard deviation of the method are controlled within 0.01 mm, with small measurement errors and high repeatability. The dynamic test also verified the repeatability of the method in the train-running environment, and the dynamic measurement deviation of rail wear is within 0.2 mm with high repeatability.
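
A compact sketch of the window search described above, offered as an assumption-based illustration rather than the authors' algorithm: polynomial degree, window size, and histogram binning are placeholders.

```python
import numpy as np

def max_curvature_entropy_window(x, y, degree=7, window=40, bins=16):
    """Fit the profile, compute curvature, and find the window of maximum curvature entropy."""
    coeffs = np.polyfit(x, y, degree)                      # profile curve fit
    d1 = np.polyval(np.polyder(coeffs, 1), x)
    d2 = np.polyval(np.polyder(coeffs, 2), x)
    kappa = np.abs(d2) / (1 + d1**2) ** 1.5                # curvature along the fit
    best_entropy, best_span = -np.inf, None
    for start in range(len(x) - window):
        counts, _ = np.histogram(kappa[start:start + window], bins=bins)
        p = counts[counts > 0] / counts.sum()
        entropy = -np.sum(p * np.log(p))                   # curvature entropy in window
        if entropy > best_entropy:
            best_entropy, best_span = entropy, (start, start + window)
    return best_span, best_entropy                         # candidate arc segment
```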

Keywords: curvature entropy, profile registration, rail wear, structured light, train-running

Procedia PDF Downloads 230
3074 Hybrid Artificial Bee Colony and Least Squares Method for Rule-Based Systems Learning

Authors: Ahcene Habbi, Yassine Boudouaoui

Abstract:

This paper deals with the problem of automatic rule generation for fuzzy system design. The proposed approach is based on hybrid artificial bee colony (ABC) optimization and the weighted least squares (LS) method, and aims to find the structure and parameters of fuzzy systems simultaneously. More precisely, two ABC-based fuzzy modeling strategies are presented and compared. The first strategy uses global optimization to learn fuzzy models; the second hybridizes ABC and the weighted least squares estimation method. The performance of the proposed ABC and ABC-LS fuzzy modeling strategies is evaluated on complex modeling problems and compared to other advanced modeling methods.

Keywords: automatic design, learning, fuzzy rules, hybrid, swarm optimization

Procedia PDF Downloads 409
3073 Promotional Effects of Zn in Cu-Zn/Core-Shell Al-MCM-41 for Selective Catalytic Reduction of NO with NH3: Acidic Properties, NOx Adsorption Properties, and Nature of Copper

Authors: Thidarat Imyen, Paisan Kongkachuichay

Abstract:

Cu-Zn/core-shell Al-MCM-41 catalyst with various copper species, prepared by a combination of three methods—substitution, ion-exchange, and impregnation, was studied for the selective catalytic reduction (SCR) of NO with NH3 at 300 °C for 150 min. In order to investigate the effects of Zn introduction on the nature of the catalyst, Cu/core-shell Al-MCM-41 and Zn/core-shell Al-MCM-41 catalysts were also studied. The roles of Zn promoter in the acidity and the NOx adsorption properties of the catalysts were investigated by in situ Fourier transform infrared spectroscopy (FTIR) of NH3 and NOx adsorption, and temperature-programmed desorption (TPD) of NH3 and NOx. The results demonstrated that the acidity of the catalyst was enhanced by the Zn introduction, as exchanged Zn(II) cations loosely bonded with Al-O-Si framework could create Brønsted acid sites by interacting with OH groups. Moreover, Zn species also provided the additional sites for NO adsorption in the form of nitrite (NO2–) and nitrate (NO3–) species, which are the key intermediates for SCR reaction. In addition, the effect of Zn on the nature of copper was studied by in situ FTIR of CO adsorption and in situ X-ray adsorption near edge structure (XANES). It was found that Zn species hindered the reduction of Cu(II) to Cu(0), resulting in higher Cu(I) species in the Zn promoted catalyst. The Cu-Zn/core-shell Al-MCM-41 exhibited higher catalytic activity compared with that of the Cu/core-shell Al-MCM-41 for the whole reaction time, as it possesses the highest amount of Cu(I) sites, which are responsible for SCR catalytic activity. The Cu-Zn/core-shell Al-MCM-41 catalyst also reached the maximum NO conversion of 100% with the average NO conversion of 76 %. The catalytic performance of the catalyst was further improved by using Zn promoter in the form of ZnO instead of reduced Zn species. The Cu-ZnO/core-shell Al-MCM-41 catalyst showed better catalytic performance with longer working reaction time, and achieved the average NO conversion of 81%.

Keywords: Al-MCM-41, copper, nitrogen oxide, selective catalytic reduction, zinc

Procedia PDF Downloads 267
3072 Hairy Beggarticks (Bidens pilosa L. - Asteraceae) Control in Sunflower Fields Using Pre-Emergence Herbicides

Authors: Alexandre M. Brighenti

Abstract:

One of the most damaging weed species in sunflower crops in Brazil is the hairy beggarticks (Bidens pilosa L.). The large number of seeds, the several vegetative cycles during the year, the staggered germination, and the scarcity of selective and effective herbicides to control this weed in sunflower are some of the attributes that hinder the effectiveness of controlling hairy beggarticks populations. The experiment was carried out with the objectives of evaluating the control of hairy beggarticks plants in sunflower crops and assessing sunflower tolerance to residual herbicides. The treatments were as follows: S-metolachlor (1,200 and 2,400 g ai ha-1), flumioxazin (60 and 120 g ai ha-1), sulfentrazone (150 and 300 g ai ha-1), and two controls (weedy and weed-free check). Phytotoxicity on sunflower plants, percentage of control and density of hairy beggarticks plants, sunflower stand and plant height, head diameter, oil content, and sunflower yield were evaluated. The herbicides flumioxazin and sulfentrazone were the most efficient in hairy beggarticks control, and S-metolachlor provided acceptable control levels. S-metolachlor (1,200 g ha-1), flumioxazin (60 g ha-1), and sulfentrazone (150 g ha-1) were the most selective doses for the sunflower crop.

Keywords: flumioxazin, Helianthus annuus, S-metolachlor, sulfentrazone, weeds

Procedia PDF Downloads 320
3071 Influence of Internal Topologies on Components Produced by Selective Laser Melting: Numerical Analysis

Authors: C. Malça, P. Gonçalves, N. Alves, A. Mateus

Abstract:

Regardless of the manufacturing process used, subtractive or additive, the material, purpose, and application, produced components are conventionally solid masses with more or less complex shapes depending on the production technology selected. Aspects such as reducing the weight of components, associated with the low volume of material required and the almost non-existent material waste, speed and flexibility of production and, primarily, high mechanical strength combined with high structural performance, are competitive advantages in any industrial sector, from automotive, molds, aviation, aerospace, construction, pharmaceuticals, and medicine to, more recently, human tissue engineering. Such features, properties, and functionalities are attained in metal components produced using the additive Rapid Prototyping technique from metal powders commonly known as Selective Laser Melting (SLM), with optimized internal topologies and varying densities. In order to produce components with high strength and high structural and functional performance, regardless of the type of application, three different internal topologies were developed and analyzed using numerical computational tools. The developed topologies were numerically subjected to mechanical compression and four-point bending tests. Finite element analysis results demonstrate how different internal topologies can contribute to improved mechanical properties, even with a high degree of porosity relative to fully dense components. The results are very promising, not only from the point of view of mechanical resistance but especially through the achievement of considerable variation in density without loss of high structural and functional performance.

Keywords: additive manufacturing, internal topologies, porosity, rapid prototyping, selective laser melting

Procedia PDF Downloads 306
3070 Trends in Incisional and Ventral Hernia Repair: A Population Analysis from 2001 to 2021

Authors: Lakmali Anthony, Madeline Gillies

Abstract:

Background: Incisional and ventral hernias are highly prevalent, with primary ventral hernias occurring in approximately 20% of adults and incisional hernias developing in up to 30% of midline abdominal incisions. Recent data from the United States have shown an increasing incidence of elective incisional and ventral hernia repair (IVHR) and emergency repair of complicated hernias. This study examines Australian population trends in IVHR over a two-decade study period. Methods: This retrospective study was performed using procedure data from the Australian Institute of Health and Welfare, and population data from the Australian Bureau of Statistics captured between 2000 and 2021 to calculate incidence rates per 100,000 population by age and sex for selected subcategories of IVHR operations. Trends over time were evaluated using simple linear regression. Results: There were 809,308 IVHR operations performed in Australia during the study period. The cumulative incidence adjusted for the population was 182 per 100,000; this increased by 9.578 per year during the study period (95% CI = 8.431- 10.726, p<.001). IVHR for primary umbilical hernias experienced the most significant increase in population-adjusted incidence, 1.177 per year. (95% CI = 0.654- 1.701, p<.001). Emergency IVHR for incarcerated, obstructed, and strangulated hernias increased by 0.576 per year (95% CI = 0.510 -0.642, p<.001). Only 20.2% of IVHR procedures were performed as day surgery. Conclusions: Australia has seen a significant increase in IVHR operations performed in the last 20 years, particularly those for primary ventral hernias. IVHR for hernias complicated by incarceration, obstruction, and strangulation also increased significantly. The proportion of IVHR operations performed as day surgery is well below the target set by the Royal Australasian College of Surgeons. With the increasing incidence of IVHR operations and an increasing proportion of these being emergent, elective IVHR should be performed as day surgery when it is safe.

Keywords: ventral, incisional, hernia, trends

Procedia PDF Downloads 45
3069 Word Spotting in Images of Handwritten Historical Documents

Authors: Issam Ben Jami

Abstract:

Information retrieval in digital libraries is very important because many famous historical documents hold significant value. Word spotting in historical documents is a very difficult problem because the writing in such documents is naturally cursive and exhibits wide variability in scale and translation of words within the same document. We first present a system for automatic recognition based on the extraction of interest points of words from the image model. The key-point extraction phase represents the image as a synthetic description of the shapes to be recognized in a multidimensional space. As a result, we use advanced methods that can find and describe interest points invariant to scale, rotation, and lighting, which are linked to local configurations of pixels. We test this approach on documents of the 15th century. Our experiments give important results.

Keywords: feature matching, historical documents, pattern recognition, word spotting

Procedia PDF Downloads 246
3068 Theorizing Optimal Use of Numbers and Anecdotes: The Science of Storytelling in Newsrooms

Authors: Hai L. Tran

Abstract:

When covering events and issues, the news media often employ both personal accounts and facts and figures. However, the process of using numbers and narratives in the newsroom mostly operates through trial and error. There is a demonstrated need for the news industry to better understand the specific effects of storytelling and data-driven reporting on the audience, as well as the explanatory factors driving such effects. In the academic world, anecdotal evidence and statistical evidence have been studied in a mutually exclusive manner. Existing research tends to treat pertinent effects as though the use of one form precludes the other and as if a tradeoff is required. Meanwhile, narratives and statistical facts are often combined in various communication contexts, especially in news presentations. There is value in reconceptualizing and theorizing about both the relative and the collective impacts of numbers and narratives, as well as the mechanism underlying such effects. The current undertaking seeks to link theory to practice by providing a complete picture of how and why people are influenced by information conveyed through quantitative and qualitative accounts. Specifically, cognitive-experiential theory is invoked to argue that humans employ two distinct systems to process information. The rational system requires the processing of logical evidence through effortful analytical cognition, which is affect-free. Meanwhile, the experiential system is intuitive, rapid, automatic, and holistic, thereby demanding minimal cognitive resources and relating to the experience of affect. In certain situations, one system might dominate the other, but the rational and experiential modes of processing operate in parallel and at the same time. As such, anecdotes and quantified facts impact audience response differently, and a combination of data and narratives is more effective than either form of evidence alone. In addition, the present study identifies several media variables and human factors driving the effects of statistics and anecdotes. An integrative model is proposed to explain how message characteristics (modality, vividness, salience, congruency, position) and individual differences (involvement, numeracy skills, cognitive resources, cultural orientation) impact selective exposure, which in turn activates pertinent modes of processing and thereby induces corresponding responses. The present study represents a step toward bridging theoretical frameworks from various disciplines to better understand the specific effects and the conditions under which the use of anecdotal evidence and/or statistical evidence enhances or undermines information processing. In addition to theoretical contributions, this research helps inform news professionals about the benefits and pitfalls of incorporating quantitative and qualitative accounts in reporting. It proposes a typology of possible scenarios and appropriate strategies for journalists to use when presenting news with anecdotes and numbers.

Keywords: data, narrative, number, anecdote, storytelling, news

Procedia PDF Downloads 57
3067 Co-Operation in Hungarian Agriculture

Authors: Eszter Hamza

Abstract:

The competitiveness of economic operators is based on their ability to co-operate, which is relatively low in Hungary. The development of co-operation is a high priority in the Common Agricultural Policy 2014-2020. The aim of the paper is to assess co-operation in Hungarian agriculture and to estimate the economic outputs and benefits of co-operation, based on statistical data processing and the literature. A further objective is to explore the potential of agricultural co-operation with the help of interviews and a questionnaire survey. The research seeks to answer what fundamental factors play a role in the development of co-operation, what the motivations of the actors are, and what the key success factors and pitfalls are. The results were analysed using econometric methods. In Hungarian agriculture we can find several forms of co-operation: cooperatives, producer groups (PG) and producer organizations (PO), machinery cooperatives, integrator companies, product boards, and interbranch organisations. Despite the many forms of agricultural co-operation, their economic weight is significantly lower in Hungary than in western European countries. Considering their agricultural importance, integrator companies carry the most weight among the forms of co-operation. Hungarian farmers are linked to co-operations or organizations mostly in relation to procurement and sales. Less than 30 percent of surveyed farmers are members of a producer organization or cooperative. The level of trust among farmers is low. The main obstacles to the development of formalized co-operation are producers' risk aversion and the black economy in agriculture. Producers often prefer informal co-operation instead of long-term contractual relationships. Hungarian agricultural co-operations are characterized by non-dynamic development but slow qualitative change. For the future, one breakout point could be the association of producer groups and organizations, which, in addition to the benefits of market concentration, can act more effectively in the dissemination of knowledge, the operation of advisory networks, and innovation.

Keywords: agriculture, co-operation, producer organisation, trust level

Procedia PDF Downloads 364
3066 Capacity Building on Small Automatic Tracking Antenna Development for Thailand Space Sustainability

Authors: Warinthorn Kiadtikornthaweeyot Evans, Nawattakorn Kaikaew

Abstract:

The communication system between the ground station and the satellite is very important to guarantee contact between both sides. Thailand, led by the Geo-Informatics and Space Technology Development Agency (GISTDA), has received satellite images from other nations' satellites for a number of years. In 2008, the Thailand Earth Observation Satellite (THEOS) became the first Earth observation satellite owned by Thailand. The mission was to monitor our country with affordable access to space-based Earth imagery. At that time, the control ground station was operated by our Thai engineers to control the THEOS satellite. Telecommands were sent to the satellite according to requests from the government and private sectors. Since then, GISTDA's engineers have gained the skills and experience to operate the satellite. Recently, the desire to use satellite data has been increasing rapidly as space technology moves fast and delivers more benefits. It is essential to ensure that Thailand remains competitive in space technology. Thai engineers have started to improve the performance of the control ground station in many different areas and to develop skills and knowledge in satellite communication. Human resource skills are being reinforced with development projects through capacity building. This paper focuses on the hands-on capacity building of GISTDA's engineers to develop a small automatic tracking antenna. The final achievement of the project is the first-phase prototype of a small automatic tracking antenna to support the new satellite technology. There are two main subsystems that have been developed and tested: the tracking system and the monitoring and control software. Function testing of the first-phase prototype has been performed with Two-Line Element (TLE) data and the mission planning (MPP) file calculated from the THEOS satellite by GISTDA.

Keywords: capacity building, small tracking antenna, automatic tracking system, project development procedure

Procedia PDF Downloads 49
3065 Matrix Method Posting

Authors: Varong Pongsai

Abstract:

The objective of this paper is to introduce a new method of accounting posting called Matrix Method Posting. This method is based on matrix operations from pure mathematics. Although accounting is classified as a social science, many accounting operations are expressed with mathematical signs and operations, which suggests that the operations of mathematics could be applied to accounting. So, this paper tries to overlay mathematical logic onto accounting logic smoothly. Following the context of discovery, a deductive approach is employed to prove a shared logical concept of both mathematics and accounting. The result shows that matrices can be used for accounting posting, because matrix and accounting logic share a similar concept of balancing two sides during operations. Moreover, matrix posting has many benefits: it can help financial analysts calculate financial ratios conveniently. Furthermore, the matrix determinant, which is a characteristic operation in itself, also helps auditors check the correctness of clients' records; if the determinant is not equal to 0, it indicates that the clients' recording process has run into a problem. Finally, matrices could readily support concepts of merger and consolidation far beyond the present-day approach.

Keywords: matrix method posting, deductive approach, determinant, accounting application

Procedia PDF Downloads 340
3064 A Preliminary Study for Design of Automatic Block Reallocation Algorithm with Genetic Algorithm Method in the Land Consolidation Projects

Authors: Tayfun Çay, Yasar İnceyol, Abdurrahman Özbeyaz

Abstract:

Land reallocation is one of the most important steps in land consolidation projects. Many different models have been proposed for land reallocation in the literature, such as fuzzy logic, block-priority-based land reallocation, and spatial decision support systems. A model comprising four parts is considered for automatic block reallocation with the genetic algorithm method in land consolidation projects. These stages are: preparing data tables for the project land, determining the conditions and constraints of land reallocation, designing the command steps and logical flow chart of the reallocation algorithm, and finally writing the program code of the genetic algorithm. In this study, we designed the first three of the model's four steps.

Keywords: land consolidation, landholding, land reallocation, optimization, genetic algorithm

Procedia PDF Downloads 399
3063 From Problem Space to Executional Architecture: The Development of a Simulator to Examine the Effect of Autonomy on Mainline Rail Capacity

Authors: Emily J. Morey, Kevin Galvin, Thomas Riley, R. Eddie Wilson

Abstract:

The key challenges faced by integrating autonomous rail operations into the existing mainline railway environment have been identified through the understanding and framing of the problem space and stakeholder analysis. This was achieved through the completion of the first four steps of Soft Systems Methodology, where the problem space has been expressed via conceptual models. Having identified these challenges, we investigated one of them, namely capacity, via the use of models and simulation. This paper examines the approach used to move from the conceptual models to a simulation which can determine whether the integration of autonomous trains can plausibly increase capacity. Within this approach, we developed an architecture and converted logical models into physical resource models and associated design features which were used to build a simulator. From this simulator, we are able to analyse mixtures of legacy-autonomous operations and produce fundamental diagrams and trajectory plots to describe the dynamic behaviour of mixed mainline railway operations.

Keywords: autonomy, executable architecture, modelling and simulation, railway capacity

Procedia PDF Downloads 54