Search results for: Signal processing
237 Study of Forging Process in 7075 Aluminum Alloy Professional Bicycle Pedal using Taguchi Method
Authors: Dyi-Cheng Chen, Wen-Hsuan Ku, Ming-Ren Chen
Abstract:
Professional bicycle pedals are mostly manufactured by casting, forging, and die-casting processes. This paper therefore uses 7075 aluminum alloy, the alloy most commonly used to produce bicycle parts, and employs the rigid-plastic finite element (FE) software DEFORM™ 3D to simulate and analyze the design of a professional bicycle pedal. First, the mold and outer appearance of the pedal were designed in SolidWorks 2010 3D CAD software and then imported into DEFORM™ 3D for analysis. A rigid-plastic analytical model was used, with the mold assumed to be a rigid body. A series of simulations, in which the forging billet temperature, friction factor, forging speed, and mold temperature were varied, reveals the effective stress, effective strain, damage, and die radial load distribution for the forged bicycle pedal. The results are intended to provide a reference for professional bicycle pedal forming molds and to assess, against the finite element results, the suitability of the high-strength aluminum alloy design.
Keywords: Bicycle pedal, finite element analysis, 7075 aluminum alloy, Taguchi method.
236 Influence of κ-Casein Genotype on Milk Productivity of Latvia Local Dairy Breeds
Authors: S. Petrovska, D. Jonkus, D. Smiltiņa
Abstract:
κ-Casein is one of the milk proteins that is very important for milk processing, and κ-casein genotypes affect milk yield as well as fat and protein content. This research analyzes the main factors that affect the milk yield and composition of local Latvian dairy breeds. Data were collected from 88 Latvian Brown and 82 Latvian Blue cows in 2015. The frequency of the AA genotype was 0.557 in the Latvian Brown and 0.232 in the Latvian Blue breed; the frequency of the BB genotype was 0.034 in the Latvian Brown and 0.207 in the Latvian Blue breed. The highest milk yield was observed in Latvian Brown cows (5131.2 ± 172.01 kg), which also had significantly higher fat content and fat yield (p < 0.05). No significant differences between κ-casein genotypes were found in the Latvian Brown breed, although the highest milk yield (5057 ± 130.23 kg), protein content (3.42 ± 0.03%), and protein yield (171.9 ± 4.34 kg) were obtained with the AB genotype. In the Latvian Blue breed, fat content was significantly higher with the BB genotype (4.29 ± 0.17%) than with the AA genotype (3.42 ± 0.19%). A similar tendency was found for protein content: 3.27 ± 0.16% with the BB genotype versus 2.59 ± 0.16% with the AA genotype (p < 0.05). Milk yield increased with increasing parity, whereas no clear trend in milk fat and protein content with parity was observed.
Keywords: κ-casein, polymorphism, dairy cows, milk productivity.
235 On Combining Support Vector Machines and Fuzzy K-Means in Vision-based Precision Agriculture
Authors: A. Tellaeche, X. P. Burgos-Artizzu, G. Pajares, A. Ribeiro
Abstract:
One important objective in precision agriculture is to minimize the volume of herbicides applied to fields through the use of site-specific weed management systems. In order to reach this goal, two major factors need to be considered: 1) the similar spectral signature, shape and texture of weeds and crops; 2) the irregular distribution of the weeds within the crop field. This paper outlines an automatic computer vision system for the detection and differential spraying of Avena sterilis, a noxious weed growing in cereal crops. The proposed system involves two processes: image segmentation and decision making. Image segmentation combines suitable basic image processing techniques in order to extract cells from the image as the low-level units. Each cell is described by two area-based attributes measuring the relations among the crops and the weeds. From these attributes, a hybrid decision-making approach determines whether or not a cell must be sprayed. The hybrid approach combines Support Vector Machines and the fuzzy k-means method through fuzzy aggregation theory, which constitutes the main contribution of this paper. The performance of the method is compared against other available strategies.
Keywords: Fuzzy k-means, precision agriculture, Support Vector Machines, weed detection.
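As an illustration of the kind of fusion described above, the following minimal sketch combines an SVM's degree of support with a fuzzy k-means membership through a simple averaging aggregation operator. The two area-based cell attributes, the toy data and the 0.5 spray threshold are illustrative assumptions, not the paper's exact design.

```python
import numpy as np
from sklearn.svm import SVC

def fuzzy_membership(x, centers, m=2.0):
    """Fuzzy k-means membership degrees of sample x to each cluster center."""
    d = np.linalg.norm(centers - x, axis=1) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()

# Toy training cells described by two area-based attributes (crop vs. weed cover).
rng = np.random.default_rng(0)
X_no = rng.normal([0.2, 0.2], 0.05, size=(20, 2))    # cells not to be sprayed
X_sp = rng.normal([0.8, 0.8], 0.05, size=(20, 2))    # cells to be sprayed
X = np.vstack([X_no, X_sp])
y = np.array([0] * 20 + [1] * 20)

svm = SVC(kernel="rbf", probability=True).fit(X, y)
centers = np.vstack([X[y == c].mean(axis=0) for c in (0, 1)])  # "no-spray" / "spray" centers

cell = np.array([0.7, 0.75])                         # attributes of a new cell
p_svm = svm.predict_proba([cell])[0, 1]              # SVM support for spraying
p_fkm = fuzzy_membership(cell, centers)[1]           # fuzzy membership to the spray cluster

decision = 0.5 * (p_svm + p_fkm)                     # averaging fuzzy aggregation
print("spray" if decision > 0.5 else "do not spray", round(float(decision), 3))
```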
234 Potential of Safflower (Carthamus tinctorius L.) for Phytoremedation of Soils Contaminated with Heavy Metals
Authors: Violina R. Angelova, Vanja I. Akova, Stefan V. Krustev, Krasimir I. Ivanov
Abstract:
A field study was conducted to evaluate the efficacy of the safflower plant for phytoremediation of contaminated soils. The experiment was performed on agricultural fields contaminated by the Non-Ferrous-Metal Works near Plovdiv, Bulgaria. Field experiments with a randomized complete block design with five treatments (control, compost amendments added at 20 and 40 t/daa, and vermicompost amendments added at 20 and 40 t/daa) were carried out. The quality of the safflower seeds and oil (heavy metal content and fatty acid composition) was determined. The tested organic amendments significantly influenced the chemical composition of the safflower seeds and oil. The compost and vermicompost treatments significantly reduced the heavy metal concentrations in safflower seeds and oils, but the effect differed between them. Addition of vermicompost and compost led to an increase in the content of palmitic and linoleic acids and a decrease in stearic and oleic acids compared with the control. A significant increase in the quantity of saturated acids was observed in the variants with 20 t/daa of compost and 20 t/daa of vermicompost (9.1 and 8.9% relative to the control). Safflower is a plant which is tolerant to heavy metals and can be successfully used in the phytoremediation of heavy metal contaminated soils. Processing the seeds to oil and using the obtained oil for nutritional purposes will greatly reduce the cost of phytoremediation.
Keywords: Heavy metals, organic amendments, phytoremediation, safflower.
233 In Silico Analysis of Pax6 Interacting Proteins Indicates Missing Molecular Links in Development of Brain and Associated Disease
Authors: Ratnakar Tripathi, Rajnikant Mishra
Abstract:
PAX6, a transcription factor, is essential for the morphogenesis of the eyes, brain, pituitary and pancreatic islets. In rodents, the loss of Pax6 function leads to central nervous system defects, anophthalmia, and nasal hypoplasia. Haplo-insufficiency of Pax6 causes microphthalmia, aggression and other behavioral abnormalities. It is also required in brain patterning and neuronal plasticity. In humans, heterozygous mutation of Pax6 causes loss of the iris (aniridia), mental retardation and glucose intolerance, and a 3' deletion in Pax6 leads to autism and aniridia. The phenotypes are variable in penetrance and expressivity. However, the mechanism of PAX6 function and its interaction with other proteins during development and associated disease are not clear. This work is intended to explore interactors of PAX6 to elucidate the biology of PAX6 function in the tissues where it is expressed and also in the central regulatory pathway. This report describes in silico approaches to explore the interacting proteins of PAX6. The models show several proteins possibly interacting with PAX6, such as MITF, SIX3, SOX2, SOX3, IPO13, TRIM, and OGT. Since Pax6 is a critical transcriptional regulator and master control gene of eye and brain development, it might interact with other proteins involved in morphogenesis (TGIF, TGF, Ras, etc.). It is also presumed that matricellular proteins (SPARC, thrombospondin-1, osteonectin, etc.) are likely to interact during transport and processing of PAX6 and lie somewhere in its cascade. Proteins involved in cell survival and cell proliferation also cannot be ignored.
Keywords: Interacting Proteins, Pax6, PIP, STRING
232 Sliding Mode Power System Stabilizer for Synchronous Generator Stability Improvement
Authors: J. Ritonja, R. Brezovnik, M. Petrun, B. Polajžer
Abstract:
Many modern synchronous generators in power systems are extremely weakly damped. The reasons are cost optimization of the machine construction and the introduction of additional control equipment into power systems. Oscillations of synchronous generators and the related stability problems of power systems are harmful and can lead to failures in operation and to damage. The only practical solution to increase the damping of these unwanted oscillations is the implementation of power system stabilizers. Power system stabilizers generate an additional control signal which changes the synchronous generator field excitation voltage. Modern power system stabilizers are integrated into the static excitation systems of synchronous generators. Available commercial power system stabilizers are based on linear control theory. Due to the nonlinear dynamics of the synchronous generator, current stabilizers do not assure optimal damping of the synchronous generator's oscillations over the entire operating range. For that reason, the use of robust power system stabilizers suitable for the entire operating range is reasonable. There are numerous robust techniques applicable to power system stabilizers. In this paper, the use of sliding mode control for synchronous generator stability improvement is studied. On the basis of sliding mode theory, a robust power system stabilizer was developed. The main advantages of the sliding mode controller are simple realization of the control algorithm, robustness to parameter variations and elimination of disturbances. The advantage of the proposed sliding mode controller over a conventional linear controller was tested for damping of the synchronous generator oscillations over the entire operating range. The obtained results show improved damping over the entire operating range of the synchronous generator and an increase of power system stability. The proposed study contributes to progress in the development of advanced stabilizers, which will replace conventional linear stabilizers and improve the damping of synchronous generators.
Keywords: Control theory, power system stabilizer, robust control, sliding mode control, stability, synchronous generator.
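To make the control idea concrete, here is a heavily simplified sketch of a basic sliding-mode law: a sliding surface on a speed-deviation error and its derivative, with a smoothed switching term. The gains, surface slope and the one-machine toy dynamics are illustrative assumptions, not the stabilizer designed in the paper.

```python
import numpy as np

c, K = 2.0, 1.5                 # sliding-surface slope and switching gain
H, D = 5.0, 0.8                 # inertia constant and damping of the toy model
dt, w_ref = 0.01, 0.0           # time step, speed-deviation reference

w, dw = 0.05, 0.0               # initial speed deviation and its derivative
for _ in range(1000):
    e, de = w - w_ref, dw
    s = c * e + de                          # sliding surface
    u = -K * np.tanh(s / 0.05)              # smoothed sign(s) to limit chattering
    # Toy swing-equation-like dynamics driven by the stabilizing signal u.
    ddw = (-D * dw - e + u) / (2.0 * H)
    dw += ddw * dt
    w += dw * dt
print(round(w, 4))              # speed deviation driven toward zero
```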
231 A 10 Giga VPN Accelerator Board for Trust Channel Security System
Authors: Ki Hyun Kim, Jang-Hee Yoo, Kyo Il Chung
Abstract:
This paper proposes a VPN Accelerator Board (VPN-AB) for a trust channel security system (TCSS). TCSS supports a secure communication channel between security nodes on the Internet. It furnishes authentication, confidentiality, integrity, and access control to security nodes transmitting data packets with the IPsec protocol. TCSS consists of an internet key exchange block, a security association block, and an IPsec engine block. The internet key exchange block negotiates the crypto algorithm and key used in the IPsec engine block. The security association block sets up and manages security association information. The IPsec engine block processes IPsec packets and contains the networking functions for communication. The IPsec engine block should be implemented in hardware with in-line mode transactions for high-speed IPsec processing. Our VPN-AB is implemented with a high-speed security processor that supports many cryptographic algorithms and in-line mode. We set up a small TCSS communication environment and measured the performance of the VPN-AB in that environment. The experimental results show that the VPN-AB achieves a maximum throughput of 15.645 Gbps when the IPsec protocol is set to 3DES-HMAC-MD5 tunnel mode.
Keywords: TCSS (Trust Channel Security System), VPN (Virtual Private Network), IPsec, SSL, security processor, security communication.
230 Learning Classifier Systems Approach for Automated Discovery of Crisp and Fuzzy Hierarchical Production Rules
Authors: Suraiya Jabin, Kamal K. Bharadwaj
Abstract:
This research presents a system for post-processing of data that takes mined flat rules as input and discovers crisp as well as fuzzy hierarchical structures using the Learning Classifier System approach. A Learning Classifier System (LCS) is basically a machine learning technique that combines evolutionary computing, reinforcement learning, supervised or unsupervised learning and heuristics to produce adaptive systems. An LCS learns by interacting with an environment from which it receives feedback in the form of a numerical reward; learning is achieved by trying to maximize the amount of reward received. A crisp description of a concept usually cannot represent human knowledge completely and practically. In the proposed Learning Classifier System, the initial population is constructed as a random collection of HPR trees (related production rules), and crisp/fuzzy hierarchies are evolved. A fuzzy subsumption relation is suggested for the proposed system, and based on the Subsumption Matrix (SM), a suitable fitness function is proposed. Suitable genetic operators are proposed for the chosen chromosome representation method. For implementing reinforcement, a suitable reward and punishment scheme is also proposed. Experimental results are presented to demonstrate the performance of the proposed system.
Keywords: Hierarchical production rule, data mining, Learning Classifier System, fuzzy subsumption relation, subsumption matrix, reinforcement learning.
229 A Life Cycle Assessment (LCA) of Aluminum Production Process
Authors: Alaa Al Hawari, Mohammad Khader, Wael El Hasan, Mahmoud Alijla, Ammar Manawi, Abdelbaki Benamour
Abstract:
The production of aluminum alloys and ingots – starting from the processing of alumina to aluminum and ending with the final cast product – was studied using a Life Cycle Assessment (LCA) approach. The studied aluminum supply chain consisted of a carbon plant, a reduction plant, a casting plant, and a power plant. In the LCA model, the environmental loads of the different plants for the production of 1 ton of aluminum metal were investigated. The impact of aluminum production was assessed in eight impact categories. The results showed that the power plant had the highest impact in all categories, except for Human Toxicity Potential (HTP), where the reduction plant had the highest impact, and Marine Aquatic Eco-Toxicity Potential (MAETP), where the carbon plant had the highest impact. Furthermore, the combined impact of the carbon plant and the reduction plant was almost the same as the impact of the power plant in the case of Acidification Potential (AP). The carbon plant had a positive impact on the environment when it comes to Eutrophication Potential (EP) due to the production of clean water in the process. The natural gas based power plant used in the case study had 8.4 times less negative impact on the environment than a heavy fuel based power plant and 10.7 times less negative impact than a hard coal based power plant.
Keywords: Life cycle assessment, aluminum production, Supply chain.
228 Accurate Visualization of Graphs of Functions of Two Real Variables
Authors: Zeitoun D. G., Thierry Dana-Picard
Abstract:
The study of a real function of two real variables can be supported by visualization using a Computer Algebra System (CAS). One type of constraint of such systems is due to the algorithms implemented, which yield continuous approximations of the given function by interpolation. This often masks discontinuities of the function and can produce strange plots that are not compatible with the mathematics. In recent years, point-based geometry has gained increasing attention as an alternative surface representation, both for efficient rendering and for flexible geometry processing of complex surfaces. In this paper we present different artifacts created by mesh surfaces near discontinuities and propose a point-based method that controls and reduces these artifacts. A least squares penalty method for automatic generation of a mesh that controls the behavior of the chosen function is presented. The special feature of this method is the ability to improve the accuracy of the surface visualization near a set of interior points where the function may be discontinuous. The method is formulated as a minimax problem, and the non-uniform mesh is generated using an iterative algorithm. Results show that for large, poorly conditioned matrices the new algorithm gives more accurate results than the classical preconditioned conjugate gradient algorithm.
Keywords: Function singularities, mesh generation, point allocation, visualization, collocation least squares method, Augmented Lagrangian method, Uzawa's algorithm, preconditioned conjugate gradient.
227 Physico-Mechanical Properties of Jute-Coir Fiber Reinforced Hybrid Polypropylene Composites
Authors: Salma Siddika, Fayeka Mansura, Mahbub Hasan
Abstract:
The term hybrid composite refers to a composite containing more than one type of fiber material as reinforcing filler. It has become an attractive structural material due to its ability to provide a better combination of properties than a composite containing a single fiber type. The eco-friendly nature, processing advantages, light weight and low cost have enhanced the attraction and interest of natural fiber reinforced composites. The objective of the present research is to study the mechanical properties of jute-coir fiber reinforced hybrid polypropylene (PP) composites as a function of filler loading. In the present work, composites were manufactured using a hot press machine at four levels of fiber loading (5, 10, 15 and 20 wt%). Jute and coir fibers were used at a ratio of 1:1 during composite manufacturing. Tensile, flexural, impact and hardness tests were conducted for mechanical characterization. The tensile tests showed a decreasing trend of tensile strength and an increasing trend of Young's modulus with increasing fiber content. In the flexural, impact and hardness tests, the flexural strength, flexural modulus, impact strength and hardness were found to increase with increasing fiber loading. Based on the fiber loadings used in this study, the 20% fiber reinforced composite gave the best set of mechanical properties.
Keywords: Mechanical properties, coir, jute, polypropylene, hybrid composite.
226 A Review on Medical Image Registration Techniques
Authors: Shadrack Mambo, Karim Djouani, Yskandar Hamam, Barend van Wyk, Patrick Siarry
Abstract:
This paper discusses current trends in medical image registration techniques and addresses the need to provide a solid theoretical foundation for research endeavours. A methodological analysis and synthesis of quality literature was done, providing a platform for developing a good foundation for research in this field, which is crucial for understanding the existing levels of knowledge. Research on medical image registration techniques assists clinical and medical practitioners in the diagnosis of tumours and lesions in anatomical organs, thereby enabling fast and accurate curative treatment of patients. The literature review aims to provide a solid theoretical foundation for research endeavours in image registration techniques; developing such a foundation is possible through a methodological analysis and synthesis of existing contributions. Out of these considerations, the aim of this paper is to enhance the scientific community's understanding of the current status of research in medical image registration techniques and to communicate the contribution of this research to the field of image processing. The gaps identified in current techniques can be closed by the use of artificial neural networks, which form learning systems designed to minimise an error function. The paper also suggests several areas of future research in image registration.
Keywords: Image registration techniques, medical images, neural networks, optimisation, transformation.
225 Object Recognition on Horse Riding Simulator System
Authors: Kyekyung Kim, Sangseung Kang, Suyoung Chi, Jaehong Kim
Abstract:
In recent years, IT convergence technology has been developed to obtain creative solutions by combining robotics and sports science technology. Object detection and recognition have mainly been applied in the sports science field through face recognition and human body tracking. However, object detection and recognition using a vision sensor is a challenging task in the real world because of illumination. In this paper, object detection and recognition using a vision sensor applied to a sports simulator is introduced. Face recognition is performed to identify the user and to automatically update a person's athletic record. The human body is tracked to offer the most accurate way of riding the horse simulator. Combined image processing is applied to reduce the adverse effect of illumination, which causes low detection and recognition performance in real-world applications. The face is recognized using a standard face graph and the human body is tracked using a pose model, both composed of feature nodes generated from diverse face and pose images. Face recognition using Gabor wavelets and pose recognition using a pose graph are robust in real applications. Simulations were performed using the ETRI database, which was constructed on the horse riding simulator.
Keywords: Horse riding simulator, Object detection, Object recognition, User identification, Pose recognition.
224 Automatic Detection of Breast Tumors in Sonoelastographic Images Using DWT
Authors: A. Sindhuja, V. Sadasivam
Abstract:
Breast cancer is the most common malignancy in women and the second leading cause of death for women all over the world. The earlier the cancer is detected, the better the treatment. The diagnosis and treatment of the cancer rely on segmentation of sonoelastographic images, yet texture features have not previously been considered for sonoelastographic segmentation. Sonoelastographic images of 15 patients containing both benign and malignant tumors are considered for experimentation. The images are enhanced to remove noise in order to improve contrast and emphasize the tumor boundary. Each image is then decomposed into sub-bands using single-level Daubechies wavelets varying from one to six coefficients. Grey Level Co-occurrence Matrix (GLCM) and Local Binary Pattern (LBP) features are extracted from each sub-band and then selected by ranking them using the Sequential Floating Forward Selection (SFFS) technique. The resultant images undergo K-means clustering followed by a few post-processing steps to remove false spots, and the tumor boundary is detected from the segmented image. It is proposed that the Local Binary Pattern of the vertical coefficients of the Daubechies wavelet with two coefficients is best suited for segmentation of sonoelastographic breast images among the wavelet members using one to six coefficients for decomposition. The results are also quantified with the help of an expert radiologist. The proposed work can be used in the further diagnostic process to decide whether the segmented tumor is benign or malignant.
Keywords: Breast Cancer, Segmentation, Sonoelastography, Tumor Detection.
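A minimal sketch of the pipeline's core steps follows: single-level Daubechies decomposition (here 'db2', i.e. two coefficients), LBP texture features on the vertical detail sub-band, and K-means clustering of the pixels. The LBP parameters, cluster count and the synthetic placeholder image are assumptions, not the study's settings.

```python
import numpy as np
import pywt
from skimage.feature import local_binary_pattern
from sklearn.cluster import KMeans

image = np.random.rand(256, 256)                 # placeholder sonoelastographic image

# Single-level 2D DWT; cV holds the vertical detail coefficients.
cA, (cH, cV, cD) = pywt.dwt2(image, "db2")

# LBP texture map computed on the vertical sub-band.
lbp = local_binary_pattern(cV, P=8, R=1, method="uniform")

# Cluster pixels on (intensity, texture) into tumor / background regions.
features = np.column_stack([cV.ravel(), lbp.ravel()])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
segmentation = labels.reshape(cV.shape)          # post-processing would follow here
```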
223 Structural and Optical Properties of Silver Sulfide-Reduced Graphene Oxide Nanocomposite
Authors: Oyugi Ngure Robert, Tabitha A. Amollo, Kallen Mulilo Nalyanya
Abstract:
Nanomaterials have attracted significant attention in research because of their exemplary properties, making them suitable for diverse applications. This paper reports the successful synthesis as well as the structural and optical properties of silver sulfide-reduced graphene oxide (Ag2S-rGO) nanocomposite. The nanocomposite was synthesized by the chemical reduction method. Scanning electron microscopy (SEM) showed that the reduced graphene oxide (rGO) sheets were intercalated within the Ag2S nanoparticles during the chemical reduction process. The SEM images also showed that Ag2S had the shape of nanowires. Further, SEM energy dispersive X-ray (SEM EDX) showed that Ag2S-rGO is mainly composed of C, Ag, O, and S. X-ray diffraction analysis manifested a high crystallinity for the nanowire-shaped Ag2S nanoparticles with a d-spacing ranging between 1.0 Å and 5.2 Å. Thermal gravimetric analysis (TGA) showed that rGO enhances the thermal stability of the nanocomposite. Ag2S-rGO nanocomposite exhibited strong optical absorption in the UV region. The formed nanocomposite is dispersible in polar and non-polar solvents, qualifying it for solution-based device processing. Thus, the surface plasmon resonance effect associated with metallic nanoparticles, the strong optical absorption, thermal stability, crystallinity and hydrophilicity of the nanocomposite suit it for solar energy conversion applications.
Keywords: Silver sulfide, reduced graphene oxide, nanocomposite, structural properties, optical properties.
222 Mobile Robot Control by Von Neumann Computer
Authors: E. V. Larkin, T. A. Akimenko, A. V. Bogomolov, A. N. Privalov
Abstract:
The digital control system for mobile robot (MR) control is considered. It is shown that sequential interpretation of the control algorithm operators, unfolding in physical time, leads to time delays between inputting data from sensors and outputting data to actuators. Another destabilizing control factor is the presence of backlash in the joints between an actuator and its executive unit. A complex model of the control system, which takes into account the dynamics of the MR, the dynamics of the digital controller and the backlash in the actuators, is worked out. The digital controller model is divided into two parts: the first part describes the control law embedded in the controller in the form of a control program that realizes a polling procedure when organizing transactions with sensors and actuators. The second part of the model describes the time delays that occur in the Von Neumann-type controller when processing data. To estimate the time intervals, the algorithm is represented in the form of an ergodic semi-Markov process. For an ergodic semi-Markov process of common form, a method is proposed for estimating the wandering time from one arbitrary state to another. An example shows how the backlash and time delays affect the quality characteristics of the MR control system.
Keywords: Mobile robot, backlash, control algorithm, Von Neumann controller, semi-Markov process, time delay.
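One common way to estimate a mean "wandering" (first-passage) time in an ergodic semi-Markov process is sketched below: with embedded transition matrix P and mean sojourn times tau, the mean passage times to a fixed target state j satisfy m = tau + P_j m, where P_j is P with column j zeroed. The three-state chain is illustrative only; it is not the controller model from the paper.

```python
import numpy as np

P = np.array([[0.0, 0.7, 0.3],     # embedded Markov chain transition matrix
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])
tau = np.array([1.0, 2.0, 0.5])    # mean sojourn times in each state

def mean_passage_times(P, tau, target):
    """Mean first-passage time from every state to `target`."""
    Pj = P.copy()
    Pj[:, target] = 0.0                       # forbid re-entering the target
    return np.linalg.solve(np.eye(len(tau)) - Pj, tau)

print(mean_passage_times(P, tau, target=2))   # wandering times into state 2
```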
221 Non-Overlapping Hierarchical Index Structure for Similarity Search
Authors: Mounira Taileb, Sid Lamrous, Sami Touati
Abstract:
In order to accelerate similarity search in high-dimensional databases, we propose a new hierarchical indexing method. It is composed of offline and online phases, and our contribution concerns both. In the offline phase, after gathering the whole of the data into clusters and constructing a hierarchical index, the main originality of our contribution consists in developing a method to construct bounding forms of clusters that avoid overlapping. For the online phase, we have developed an adapted search algorithm that considerably improves the performance of similarity search. Our method, named NOHIS (Non-Overlapping Hierarchical Index Structure), uses Principal Direction Divisive Partitioning (PDDP) as the clustering algorithm. The principle of PDDP is to divide the data recursively into two sub-clusters; the division is done using the hyperplane orthogonal to the principal direction derived from the covariance matrix and passing through the centroid of the cluster to be divided. The data of each of the two resulting sub-clusters are enclosed in a minimum bounding rectangle (MBR). The two MBRs are oriented along the principal direction; consequently, non-overlapping of the two forms is assured. Experiments use databases containing image descriptors. Results show that the proposed method outperforms sequential scan and the SR-tree in processing k-nearest neighbors.
Keywords: K-nearest neighbour search, multi-dimensional indexing, multimedia databases, similarity search.
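The following is a minimal sketch of a single PDDP split as described above: project the cluster onto its principal direction (leading right singular vector of the centered data) and divide it by the hyperplane through the centroid orthogonal to that direction. The random descriptors are illustrative only.

```python
import numpy as np

def pddp_split(X):
    """Split X into two sub-clusters with the PDDP rule."""
    centroid = X.mean(axis=0)
    Xc = X - centroid
    # Leading right singular vector = principal direction of the cluster.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    direction = Vt[0]
    proj = Xc @ direction                 # signed distance to the splitting hyperplane
    return X[proj <= 0], X[proj > 0]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))            # 200 image descriptors of dimension 16
left, right = pddp_split(X)
print(left.shape, right.shape)
# In NOHIS, each sub-cluster would then be enclosed in an MBR oriented along
# `direction` before recursing.
```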
220 Static and Dynamic Three-Dimensional Finite Element Analysis of Pelvic Bone
Authors: M. S. El-Asfoury, M. A. El-Hadek
Abstract:
The complex shape of the human pelvic bone was successfully imaged and modeled using finite element (FE) processing. The bone was subjected to quasi-static and dynamic loading conditions simulating the effects of both weight gain and impact. Loads varying between 500 and 2500 N (approximately 50–250 kg of weight) were used to simulate 3D quasi-static weight gain. Two different 3D dynamic analyses, body free fall from two different heights (1 and 2 m) and forced side impact at two different velocities (20 and 40 km/h), were also studied. The computed stresses were compared for the four loading cases; the von Mises stresses increase linearly with increasing weight gain under quasi-static loading. For the dynamic models, the von Mises stress histories were studied for the affected area and applied load with respect to time. The von Mises stresses normalized with respect to the applied load were used to compare the free fall and forced impact results. It was found that under the forced impact loading condition an overlapping behavior was noticed, whereas for the free fall the normalized von Mises stress behavior was found to differ nonlinearly. This phenomenon is explained through the energy dissipation concept. This study will help designers in different specializations to define the weakest spots when designing different supporting systems.
Keywords: Pelvic bone, static and dynamic analysis, three-dimensional finite element analysis.
219 Artificial Intelligence: A Comprehensive and Systematic Literature Review of Applications and Comparative Technologies
Authors: Z. M. Najmi
Abstract:
Over the years, the question around Artificial Intelligence has always been one with many answers. Whether by means of use in business and industry or complicated algorithmic programming, management of these technologies has always been the core focus. More recently, technologies have been questioned in industry and society alike as to whether they have improved human-centred design, assisted choices and objectives, and had a hand in systematic processes across the board. With these questions, the answer may lie within AI technologies and the steps needed to remove common human error. Elements such as machine learning, deep learning, recommender systems and natural language processing will all be features to consider moving forward. Our previous intervention with AI applications has resulted in increased productivity but has raised concerns for the continuation of traditional human-centred occupations. Emerging technologies such as augmented reality and virtual reality have all played a part in this during AI's prominent rise. As mentioned, AI has been constantly under the microscope; its benefits and drawbacks may seem endless, but AI is something we must take notice of and adapt into our everyday lives. The aim of this paper is to give an overview of the technologies surrounding AI and its related technologies. A comprehensive review has been written as a timeline of the developing events and key points in the history of Artificial Intelligence. This research is gathered entirely from secondary sources and academic statements of knowledge, brought together to produce an understanding of the timeline of AI.
Keywords: Artificial Intelligence, Deep Learning, Augmented Reality, Reinforcement Learning, Machine Learning, Supervised Learning.
218 Processing and Assessment of Quality Characteristics of Composite Baby Foods
Authors: Reihaneh Ahmadzadeh Ghavidel, Mehdi Ghiafeh Davoodi
Abstract:
The usefulness of weaning foods to meet the nutrient needs of children is well recognized; most of them are precooked, roller-dried mixtures of cereal and/or legume flours which possess a high viscosity and bulk when reconstituted. The objective of this study was to formulate composite weaning foods using cereals, malted legumes and vegetable powders and to analyze them for nutrients, functional properties and sensory attributes. Selected legumes (green gram and lentil) were germinated, dried and dehulled. Roasted wheat, rice, carrot powder and skim milk powder were also used. All the ingredients were mixed in different proportions to give four formulations, made into a 30% slurry and dried in a roller drier. The products were analyzed for proximate principles, mineral content, and functional and sensory qualities. The analysis showed the following ranges of constituents per 100 g of formulation on a dry weight basis: protein, 18.1–18.9 g; fat, 0.78–1.36 g; iron, 5.09–6.53 mg; calcium, 265–310 mg. The lowest water absorption capacity was found in the wheat–green gram based sample and the highest in the rice–lentil based sample. The overall sensory qualities of all foods were graded as "good" and "very good" with no significant differences. The results confirm that the formulated weaning foods were nutritionally superior, functionally appropriate and organoleptically acceptable.
Keywords: Malted legumes, weaning foods, nutrition, functional properties.
217 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum must be improved in order to reduce or eliminate defects and to improve the process capability indices Cp and Cpk for a CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that led to the targeted surface roughness specified by the customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct. The new settings also improved the process capability index. The purpose of this study is to show that the Taguchi-based Six Sigma approach can be efficiently used to phase out defects and improve the process capability index of the CNC milling process.
Keywords: CNC machining, Six Sigma, Surface roughness, Taguchi methodology.
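A brief sketch of the Taguchi analysis step follows: smaller-the-better signal-to-noise ratios for surface roughness over an L9 orthogonal array, followed by factor-level main effects. The L9 layout is standard; the roughness readings are made-up numbers, not the study's data.

```python
import numpy as np

# L9(3^4) orthogonal array: 9 runs x 4 factors, levels coded 0, 1, 2.
L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
               [1,0,1,2],[1,1,2,0],[1,2,0,1],
               [2,0,2,1],[2,1,0,2],[2,2,1,0]])

# Two replicated roughness measurements (um) per run -- illustrative values.
y = np.array([[1.8,1.9],[1.6,1.7],[1.5,1.6],
              [1.4,1.5],[1.3,1.4],[1.7,1.8],
              [1.2,1.3],[1.6,1.5],[1.4,1.3]])

# Smaller-the-better signal-to-noise ratio for each run.
sn = -10.0 * np.log10((y ** 2).mean(axis=1))

# Main effect of each factor = mean S/N at each of its three levels;
# the best level is the one with the highest S/N.
for f in range(4):
    effects = [sn[L9[:, f] == lvl].mean() for lvl in (0, 1, 2)]
    print(f"factor {f}: level S/N = {np.round(effects, 2)}, best level = {int(np.argmax(effects))}")
```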
216 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis
Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen
Abstract:
Hepatitis is one of the most common and dangerous diseases that affects humankind, and exposes millions of people to serious health risks every year. Diagnosis of hepatitis has always been a challenge for physicians. This paper presents an effective method for diagnosis of hepatitis based on an interval Type-II fuzzy system. The proposed system includes three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an "indirect approach" is used for fuzzy system modeling by implementing the exponential compactness and separation index for determining the number of rules in the fuzzy clustering approach. Therefore, we first proposed a Type-I fuzzy system that had an accuracy of approximately 90.9%. In the proposed system, the process of diagnosis faces vagueness and uncertainty in the final decision. Thus, the imprecise knowledge was managed by using interval Type-II fuzzy logic. The results that were obtained show that interval Type-II fuzzy logic has the ability to diagnose hepatitis with an average accuracy of 93.94%. The classification accuracy obtained is the highest one reached thus far. The aforementioned rate of accuracy demonstrates that the Type-II fuzzy system has a better performance in comparison to Type-I and indicates a higher capability of the Type-II fuzzy system for modeling uncertainty.
Keywords: Hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection.
215 Wildfires Assessed by Remote Sense Images and Burned Land Monitoring
Authors: M. C. Proença
Abstract:
The tools described in this paper enable the location of burned areas in which the annihilation of natural habitats took place and establish a baseline for major changes in forest ecosystems during recovery. Moreover, the result allows the follow-up of the surface fuel loading, allowing the evaluation and guidance of restoration measures in remote areas by phased time planning. This case study implements the evaluation of burned areas that suffered successive wildfires in mainland Portugal during the summer of 2017, which killed more than 60 people. The goal is to show that this evaluation can be done with remote sensing data free of charge on a simple laptop, with open-source software, describing the not-so-simple methodology step by step to make it accessible to local workers in the affected areas, where the availability of information is essential for the immediate planning of mitigation measures, such as restoring road access, allocating funds for the recovery of human dwellings and assessing further needs for restoration of the ecological system. Wildfires also devastate forest ecosystems, having a direct impact on vegetation cover and killing or driving away the animal population, besides the loss of all crops in rural areas that are essential as local resources. Economic interests are also affected, as burned pinewood becomes useless for the noblest applications, so its value decreases and resin extraction ends for several years.
Keywords: Image processing, remote sensing, wildfires, burned areas, SENTINEL-2.
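A common burned-area mapping step with free Sentinel-2 data is sketched below: the differenced Normalized Burn Ratio (dNBR) from pre- and post-fire scenes. The abstract does not state which index the author used, so this is only an illustration; the band choice (B8 = NIR, B12 = SWIR) and the 0.27 severity threshold are conventional values, not taken from the paper.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio for reflectance arrays."""
    return (nir - swir) / (nir + swir + 1e-6)

# Placeholder reflectance rasters (values in 0..1); in practice these would be
# read from the Sentinel-2 B8 and B12 bands with a library such as rasterio.
pre_b8,  pre_b12  = np.full((4, 4), 0.45), np.full((4, 4), 0.15)
post_b8, post_b12 = np.full((4, 4), 0.20), np.full((4, 4), 0.30)

dnbr = nbr(pre_b8, pre_b12) - nbr(post_b8, post_b12)
burned_mask = dnbr > 0.27          # rough "moderate severity or worse" cut-off
print(float(dnbr.mean()), int(burned_mask.sum()), "burned pixels")
```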
214 Isolation and Screening of Fungal Strains for β-Galactosidase Production
Authors: Parmjit S. Panesar, Rupinder Kaur, Ram S. Singh
Abstract:
Enzymes are biocatalysts which catalyze biochemical processes and thus have a wide variety of applications in the industrial sector. β-Galactosidase (E.C. 3.2.1.23), also known as lactase, is one of the prime enzymes with significant potential in the dairy and food processing industries. It has the capability to catalyze both the hydrolytic reaction for the production of lactose-hydrolyzed milk and the transgalactosylation reaction for the synthesis of prebiotics such as lactulose and galactooligosaccharides. These prebiotics have various nutritional and technological benefits. Although the enzyme is naturally present in almonds, peaches, apricots, a variety of other fruits and in animals, the extraction of the enzyme from these sources increases its cost. Therefore, the focus has shifted towards the production of low-cost enzyme from microorganisms such as bacteria, yeast and fungi. Compared to yeast and bacteria, fungal β-galactosidase is generally preferred as it is extracellular and thermostable in nature. Keeping the above in view, the present study was carried out to isolate β-galactosidase-producing fungal strains from food as well as agricultural wastes. A total of more than 100 fungal cultures were examined for their potential in enzyme production. All the fungal strains were screened using X-gal and IPTG as inducers in modified Czapek Dox agar medium. Among the isolated fungal strains, the strain exhibiting the highest enzyme activity was chosen for further phenotypic and genotypic characterization. The strain was identified as Rhizomucor pusillus on the basis of 5.8S rRNA gene sequencing data.
Keywords: β-galactosidase, enzyme, fungus, isolation.
213 Meaning Chasing Kiddies: Children's Perception of Metaphors Used in Printed Advertisements
Authors: Asina Gülerarslan
Abstract:
Today's children, who are born into a more colorful, more creative, more abstract and more accessible communication environment than their ancestors as a result of dizzying advances in technology, have an interesting capacity to perceive and make sense of the world. Millennium children, who live in an environment where all kinds of marketing communication efforts are more intensive than ever, are, from their early childhood on, subject to all kinds of persuasive messages. Advertising communication outperforms all other marketing communication efforts in creating little consumer individuals and, through the processing of codes and signs, plays a significant part in building a world of seeing, thinking and understanding for children. Children who are raised with metaphorical expressions such as tales and riddles also encounter this fast and effective communication of meaning in advertisements. Children's perception of metaphors, which help them grasp the "product and its promise" both verbally and visually and facilitate the association between them, is the subject of this study. Stimulating and activating the imagination, metaphors have unique advantages in promoting the product and its promise, especially in print advertisements, which have certain limitations. This study deals comparatively with literal and metaphoric versions of print advertisements belonging to various product groups and attempts to discover to what extent the advertisements are liked, recalled, perceived and found persuasive. The sample group of the study, which was conducted in two elementary schools situated in areas with different socioeconomic features, consisted of children aged 12.
Keywords: Children, metaphor, perception, print advertisements, recall.
212 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights
Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan
Abstract:
The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyse huge datasets, efficient NoSQL databases are needed. The analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic are made possible by this research's integration of several datasets, which cuts down on query processing time and creates predictive visual artifacts. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Effective data retrieval and analysis are made possible by spreading the datasets across a sharded database and indexing the individual shards. Analysis of the connections between governmental activities, poverty levels, and post-pandemic well-being is the key goal. We aim to evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that support the advancement of the UN sustainability objectives, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
Keywords: COVID-19, big data, data analysis, indexing, NoSQL, sharding, scalability, poverty.
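The two techniques recommended above are sketched here with PyMongo against a sharded cluster (a mongos router). The database, collection and field names ("mobility", "records", "country", "date") are illustrative assumptions, not the study's actual schema.

```python
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")   # connect to the mongos router
db = client["mobility"]

# Indexing: a compound index so country/date range queries avoid full scans.
db.records.create_index([("country", ASCENDING), ("date", ASCENDING)])

# Sharding: enable it for the database, then distribute the collection across
# shards on a hashed key to spread both storage and query load.
client.admin.command("enableSharding", "mobility")
client.admin.command("shardCollection", "mobility.records",
                     key={"country": "hashed"})

# A typical analytical query now hits only the relevant shards and index ranges.
cursor = db.records.find({"country": "CA",
                          "date": {"$gte": "2021-01-01"}}).limit(5)
print(list(cursor))
```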
211 Object Identification with Color, Texture, and Object-Correlation in CBIR System
Authors: Awais Adnan, Muhammad Nawaz, Sajid Anwar, Tamleek Ali, Muhammad Ali
Abstract:
The need for efficient information retrieval has increased more than ever in recent years because of the frequent use of digital information in our lives. We see a lot of work in the area of textual information, but in multimedia information we cannot find much progress. For text-based information, technologies such as data mining and data marts, which grew out of the basic database concepts of the 1960s, are now in operation. In image search, and especially in image identification, computerized systems are at a very initial stage. Even in the area of image search we cannot see as much progress as in the case of text-based search techniques. One main reason for this is the widespread roots of image search, where many areas such as artificial intelligence, statistics, image processing and pattern recognition play their roles. Even human psychology, perception and cultural diversity have their share in the design of a good and efficient image recognition and retrieval system. A new object-based search technique is presented in this paper, where objects in the image are identified on the basis of their geometrical shapes and other features such as color and texture, while object correlation augments this search process. To remain focused on object identification, simple images are selected for this work in order to reduce the role of segmentation in the overall process; however, the same technique can also be applied to other images.
Keywords: Object correlation, geometrical shape, color, texture, features, contents.
210 Target Detection using Adaptive Progressive Thresholding Based Shifted Phase-Encoded Fringe-Adjusted Joint Transform Correlator
Authors: Inder K. Purohit, M. Nazrul Islam, K. Vijayan Asari, Mohammad A. Karim
Abstract:
A new target detection technique is presented in this paper for the identification of small boats in coastal surveillance. The proposed technique employs an adaptive progressive thresholding (APT) scheme to first process the given input scene to separate any objects present in the scene from the background. The preprocessing step results in an image containing only the foreground objects, such as boats, trees and other cluttered regions, and hence reduces the search region for the correlation step significantly. The processed image is then fed to the shifted phase-encoded fringe-adjusted joint transform correlator (SPFJTC) technique, which produces a single, delta-like correlation peak for a potential target present in the input scene. A post-processing step uses a peak-to-clutter ratio (PCR) to determine whether the boat in the input scene is authorized or unauthorized. Simulation results are presented to show that the proposed technique can successfully determine the presence of an authorized boat and identify any intruding boat present in the given input scene.
Keywords: Adaptive progressive thresholding, fringe-adjusted filters, image segmentation, joint transform correlation, synthetic discriminant function.
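For readers unfamiliar with joint transform correlation, the sketch below shows only the correlation core of a classical JTC built from FFTs: the reference and the scene are placed side by side, the joint power spectrum is formed, and its inverse transform yields cross-correlation peaks. The shifted phase encoding and fringe-adjusted filtering of the SPFJTC, and the APT preprocessing, are not reproduced here; the toy template and scene are illustrative assumptions.

```python
import numpy as np

def classical_jtc(reference, scene):
    """Return the correlation-plane magnitude of a classical JTC."""
    h, w = reference.shape
    joint = np.zeros((h, 2 * w))
    joint[:, :w] = reference                        # reference on the left
    joint[:, w:] = scene                            # input scene on the right
    jps = np.abs(np.fft.fft2(joint)) ** 2           # joint power spectrum
    return np.abs(np.fft.fftshift(np.fft.ifft2(jps)))

rng = np.random.default_rng(1)
scene = rng.random((64, 64))
reference = np.zeros((64, 64)); reference[24:40, 24:40] = 1.0   # toy "boat" template
scene[10:26, 30:46] += 1.0                                      # embed the target

corr = classical_jtc(reference, scene)
h, w = corr.shape
corr[h // 2 - 16:h // 2 + 16, w // 2 - 32:w // 2 + 32] = 0.0    # suppress the zero-order term
pcr = corr.max() / (np.median(corr) + 1e-12)                    # crude peak-to-clutter ratio
print(round(float(pcr), 1))
```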
209 Six Sigma-Based Optimization of Shrinkage Accuracy in Injection Molding Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using six sigma methodologies to reach the desired shrinkage of a manufactured high-density polyurethane (HDPE) part produced by the injection molding machine. It presents a case study where the correct shrinkage is required to reduce or eliminate defects and to improve the process capability index Cp and Cpk for an injection molding process. To improve this process and keep the product within specifications, the six sigma methodology, design, measure, analyze, improve, and control (DMAIC) approach, was implemented in this study. The six sigma approach was paired with the Taguchi methodology to identify the optimized processing parameters that keep the shrinkage rate within the specifications by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of the cooling time, melt temperature, holding time, and metering stroke. The noise factor is the difference between material brand 1 and material brand 2. After the confirmation run was completed, measurements verify that the new parameter settings are optimal. With the new settings, the process capability index has improved dramatically. The purpose of this study is to show that the six sigma and Taguchi methodology can be efficiently used to determine important factors that will improve the process capability index of the injection molding process.
Keywords: Injection molding, shrinkage, six sigma, Taguchi parameter design.
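The process-capability check used to judge improvement can be written down directly: Cp = (USL − LSL) / (6σ) and Cpk = min(USL − μ, μ − LSL) / (3σ). The shrinkage specification limits and sample values in the sketch below are invented for illustration; they are not the study's data.

```python
import numpy as np

def cp_cpk(samples, lsl, usl):
    """Process capability indices from a sample and the spec limits."""
    mu, sigma = samples.mean(), samples.std(ddof=1)
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk

shrinkage = np.array([1.52, 1.49, 1.51, 1.50, 1.48, 1.53, 1.50, 1.49])  # % shrinkage
cp, cpk = cp_cpk(shrinkage, lsl=1.40, usl=1.60)
print(round(cp, 2), round(cpk, 2))   # values >= 1.33 are commonly taken as capable
```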
208 Continuous Measurement of Spatial Exposure Based on Visual Perception in Three-Dimensional Space
Authors: Nanjiang Chen
Abstract:
In the backdrop of expanding urban landscapes, accurately assessing spatial openness is critical. Traditional visibility analysis methods grapple with discretization errors and inefficiencies, creating a gap in truly capturing the human experience of space. Addressing these gaps, this paper presents a continuous visibility algorithm, providing a potentially valuable approach to measuring urban spaces from a human-centric perspective. This study presents a methodological breakthrough by applying this algorithm to urban visibility analysis. Unlike conventional approaches, this technique allows for a continuous range of visibility assessment, closely mirroring human visual perception. By eliminating the need for predefined subdivisions in ray casting, it offers a more accurate and efficient tool for urban planners and architects. The proposed algorithm not only reduces computational errors but also demonstrates faster processing capabilities, validated through a case study in Beijing's urban setting. Its key distinction lies in its potential to benefit a broad spectrum of stakeholders, ranging from urban developers to public policymakers, aiding in the creation of urban spaces that prioritize visual openness and quality of life. This advancement in urban analysis methods could lead to more inclusive, comfortable, and well-integrated urban environments, enhancing the spatial experience for communities worldwide.
Keywords: Visual openness, spatial continuity, ray-tracing algorithms, urban computation.
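For context, the conventional discretized baseline that the abstract contrasts itself with is sketched below: a 2D isovist-style openness measure obtained by casting a fixed number of rays from a viewpoint and averaging the unobstructed distances. The continuous formulation proposed in the paper removes this angular subdivision; the scene layout here is purely illustrative.

```python
import math

def ray_distance(origin, angle, segments, max_range=100.0):
    """Distance from origin to the nearest wall segment along one ray."""
    ox, oy = origin
    dx, dy = math.cos(angle), math.sin(angle)
    best = max_range
    for (x1, y1), (x2, y2) in segments:
        ex, ey = x2 - x1, y2 - y1
        denom = dx * ey - dy * ex
        if abs(denom) < 1e-12:
            continue                                    # ray parallel to the wall
        t = ((x1 - ox) * ey - (y1 - oy) * ex) / denom   # distance along the ray
        u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom   # position along the wall
        if t > 0 and 0.0 <= u <= 1.0:
            best = min(best, t)
    return best

walls = [((5, -5), (5, 5)), ((-8, 3), (0, 8))]          # two building edges
viewpoint, n_rays = (0.0, 0.0), 360                     # 1-degree subdivision
openness = sum(ray_distance(viewpoint, 2 * math.pi * k / n_rays, walls)
               for k in range(n_rays)) / n_rays
print(round(openness, 2))                               # mean visible distance
```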