Search results for: Homotopy Perturbation Method
11497 Automatic Algorithm for Processing and Analysis of Images from the Comet Assay
Authors: Yeimy L. Quintana, Juan G. Zuluaga, Sandra S. Arango
Abstract:
The comet assay is an electrophoresis-based method used to measure DNA damage in cells and has produced important results in identifying substances that pose a potential risk to the human population, including innumerable physical, chemical, and biological agents. With this technique it is possible to obtain comet-like images in which the tail corresponds to damaged fragments of the DNA. One of the main problems is that the image has unequal luminosity caused by the fluorescence microscope, so it requires several processing steps to condition it, to determine how many usable comets there are per sample, and finally to perform the measurements and determine the percentage of DNA damage. In this paper, we propose the design and implementation of software using the Image Processing Toolbox in MATLAB that automates this image processing. The software chooses the optimal comets and measures the parameters needed to detect the damage.
Keywords: artificial vision, comet assay, DNA damage, image processing
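As a hedged illustration of the final measurement step (a sketch, not the authors' MATLAB implementation), the percentage of DNA in the tail and a tail-moment variant can be computed from a one-dimensional intensity profile of a background-corrected comet; the head/tail boundary index is assumed given here, whereas the described software derives it from the segmented head region.

```python
# Hypothetical sketch of comet-assay damage metrics from a 1-D intensity
# profile (e.g. column sums of the comet image). `head_end` is an assumed
# boundary index separating head from tail.

def percent_dna_in_tail(profile, head_end):
    """Share (in %) of total fluorescence intensity lying beyond the head."""
    total = sum(profile)
    if total == 0:
        return 0.0
    return 100.0 * sum(profile[head_end:]) / total

def olive_tail_moment(profile, head_end):
    """%DNA in tail times the tail's intensity centroid distance."""
    tail = profile[head_end:]
    tail_sum = sum(tail)
    if tail_sum == 0:
        return 0.0
    centroid = sum(i * v for i, v in enumerate(tail)) / tail_sum
    return percent_dna_in_tail(profile, head_end) * centroid / 100.0
```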
Procedia PDF Downloads 312
11496 Kannudi- A Reference Editor for Kannada (Based on OPOK! and OHOK! Principles, and Domain Knowledge)
Authors: Vishweshwar V. Dixit
Abstract:
Kannudi is a reference editor introducing a method of input for Kannada, called OHOK!, that is, Ottu Hāku Ottu Koḍu!. This is especially suited for pressure-sensitive input devices, though the current online implementation uses a regular mechanical keyboard. OHOK! has three possible modes, namely, sva-ottu (self-conjunct), kandante (as you see), and andante (as you say). It may be noted that kandante mode does not follow the phonetic order; however, this model may work well for those who are inclined to visualize as they type rather than vocalize the sounds. Kannudi also demonstrates how domain knowledge can be used effectively to potentially increase speed, accuracy, and user-friendliness, for example, through selection of a default vowel, automatic shunyification, and arkification. Also implemented are four types of Delete that are necessary for phono-syllabic languages like Kannada.
Keywords: Kannada, conjunct, reference editor, pressure input
Procedia PDF Downloads 95
11495 Raman Spectroscopy of Carbon Nanostructures in Strong Magnetic Field
Authors: M. Kalbac, T. Verhagen, K. Drogowska, J. Vejpravova
Abstract:
One- and two-dimensional carbon nanostructures with sp2 hybridization of carbon atoms (single-walled carbon nanotubes and graphene) are promising materials for future electronic and spintronic devices due to the specific character of their electronic structure. In this paper, we present a comparative study of graphene and single-wall carbon nanotubes by Raman spectro-microscopy in a strong magnetic field. This unique method allows us to study the changes in the electronic band structure of the two types of carbon nanostructures induced by a strong magnetic field.
Keywords: carbon nanostructures, magnetic field, Raman spectroscopy, spectro-microscopy
Procedia PDF Downloads 273
11494 The Influence of Residual Stress on Hardness and Microstructure in Railway Rails
Authors: Muhammet Emre Turan, Sait Özçelik, Yavuz Sun
Abstract:
In this study, residual stress in railway rails was measured, and the residual stress values were related to hardness and microstructure. First, three one-meter-long rails were taken, and residual stresses were measured by the cutting method according to the EN 13674-1 standard. A strain gauge, an electrical apparatus, was used: during cutting, the change in resistance in the rail yielded the residual stress value via a computer program. After the residual stress measurement, Brinell hardness distributions were obtained for the head parts of the rails, establishing the relationship between residual stress and hardness. In addition, microstructure analysis was carried out by optical microscopy. The results show that the microstructure and hardness values change with residual stress.
Keywords: residual stress, hardness, microstructure, rail, strain gauge
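A minimal uniaxial sketch of the computation behind the cutting method: the strain released by sectioning is read from the gauge's relative resistance change, and the pre-existing residual stress is recovered via Hooke's law. The gauge factor and modulus below are typical textbook values, not values taken from EN 13674-1 or the study.

```python
# Assumed, illustrative constants (not from the paper):
E_STEEL = 210e9      # Young's modulus of rail steel, Pa (typical)
GAUGE_FACTOR = 2.1   # foil strain-gauge sensitivity (typical)

def released_strain(delta_r_over_r, gauge_factor=GAUGE_FACTOR):
    """Strain released by cutting, from the relative resistance change."""
    return delta_r_over_r / gauge_factor

def residual_stress(delta_r_over_r, modulus=E_STEEL):
    """Residual stress is equal and opposite to the stress released by cutting."""
    return -modulus * released_strain(delta_r_over_r)
```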
Procedia PDF Downloads 603
11493 The Environmental Impact of Wireless Technologies in Nigeria: An Overview of the IoT and 5G Network
Authors: Powei Happiness Kerry
Abstract:
The introduction of wireless technologies in Nigeria has improved the quality of life of Nigerians; however, not everyone sees it in that light. This paper on the environmental impact of wireless technologies in Nigeria summarizes scholarly views on the impact of wireless technologies on the environment, focusing on 5G and the Internet of Things in Nigeria while also exploring the Technology Acceptance Model (TAM). The study used a qualitative research method to gather data from relevant sources and draws inferences from the derived data in context. The study concludes that the Federal Government of Nigeria, before agreeing to any new development in the world of wireless technologies, should weigh the implications and deliberate extensively with all stakeholders, taking into consideration the confirmation it will receive from the National Assembly.
Keywords: Internet of Things, radiofrequency, electromagnetic radiation, information and communications technology, ICT, 5G
Procedia PDF Downloads 135
11492 Tool for Maxillary Sinus Quantification in Computed Tomography Exams
Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina
Abstract:
The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, heat or humidify inspired air, aid thermoregulation, and impart resonance to the voice, among other roles; thus, the real function of the MS is still uncertain. Furthermore, MS anatomy is complex and varies from person to person, and many diseases may affect the developmental process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, enabling quantitative analysis. However, this is not always possible in the clinical routine, and when possible, it involves much effort and/or time. Therefore, a convenient, robust, and practical tool correlated with the MS volume is needed to allow clinical applicability. Currently available methods for MS segmentation are manual or semi-automatic; additionally, manual methods present inter- and intra-individual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of paranasal sinuses. This study was developed with ethical approval from the authors’ institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) driven by features such as pixel value, spatial distribution, and shape.
The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, yielding the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. In the statistical comparison between the two methods, the linear regression showed a strong association and low dispersion between variables, the Bland-Altman analyses showed no significant differences between the methods, and the Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice; thus, it may be useful in the diagnosis and treatment of MS diseases.
Keywords: maxillary sinus, support vector machine, region growing, volume quantification
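A hedged sketch of the segmentation back-end described above: seed pixels (given directly here, whereas the paper obtains them from the SVM detector) are grown into a region over 4-connected neighbours whose intensity stays within a tolerance, and the result can be scored against a reference mask with the Jaccard similarity coefficient.

```python
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from `seed` over 4-connected pixels within `tol` of the seed value."""
    rows, cols = len(image), len(image[0])
    base = image[seed[0]][seed[1]]
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in region:
                if abs(image[nr][nc] - base) <= tol:
                    region.add((nr, nc))
                    queue.append((nr, nc))
    return region

def jaccard(a, b):
    """Jaccard similarity of two pixel sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 1.0
```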
Procedia PDF Downloads 504
11491 On the Effect of Immigration on Destination-Country Corruption
Authors: Eugen Dimant, Tim Krieger, Margarete Redlin
Abstract:
This paper analyzes the impact of migration on destination-country corruption levels. Capitalizing on a comprehensive dataset consisting of annual immigration stocks of OECD countries from 207 countries of origin for the period 1984-2008, we explore different channels through which corruption might migrate. We employ different estimation methods using fixed effects and Tobit regressions in order to validate our findings. Moreover, we address the issue of endogeneity by using the difference generalized method of moments (GMM) estimator. Independent of the econometric methodology, we consistently find that while general migration has an insignificant effect on the destination country’s corruption level, immigration from corruption-ridden origin countries boosts corruption in the destination country. Our findings provide a more profound understanding of the economic implications associated with migration flows.
Keywords: corruption, migration, impact of migration, destination-country corruption
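As a toy illustration of the fixed-effects idea behind one of the estimation methods above (not the paper's Tobit or GMM estimators), the within transformation demeans each country's series, removing time-invariant country effects before a simple OLS slope is fit on the pooled, demeaned data.

```python
from statistics import mean

def demean_by_group(values, groups):
    """Subtract each group's mean from its values (the 'within' transformation)."""
    centers = {g: mean(v for v, gg in zip(values, groups) if gg == g)
               for g in set(groups)}
    return [v - centers[g] for v, g in zip(values, groups)]

def ols_slope(x, y):
    """Simple OLS slope of y on x."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx
```

Demeaning first makes the slope invariant to any constant shift added to one group, which is exactly what a country fixed effect absorbs.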
Procedia PDF Downloads 327
11490 A Mixed Method Approach for Modeling Entry Capacity at Rotary Intersections
Authors: Antonio Pratelli, Lorenzo Brocchini, Reginald Roy Souleyrette
Abstract:
A rotary is a traffic circle intersection where vehicles entering from branches give priority to circulating flow. Vehicles entering the intersection from converging roads move around the central island and weave out of the circle into their desired exiting branch. This creates merging and diverging conflicts between any entry and its successive exit, i.e., a section. Therefore, rotary capacity models are usually based on the weaving of the different movements in any section of the circle, and the maximum flow rate is then related to each weaving section of the rotary. Nevertheless, the single-section capacity value does not yield the typical performance characteristics of the intersection, such as the entry average delay, which is directly linked to its level of service. From another point of view, modern roundabout capacity models are based on limiting the flow entering from a single entrance by the amount of flow circulating in front of the entrance itself; such models generally also lead to a performance evaluation. This paper aims to incorporate a modern roundabout capacity model into an old rotary capacity method to obtain from the latter the single-entry capacity and ultimately the related performance indicators. Put simply, the main objective is to calculate the average delay of each roundabout entrance in order to apply the most common Highway Capacity Manual (HCM) criteria. The paper is organized as follows: first, the rotary and roundabout capacity models are sketched, and a brief introduction is given to the model combination technique with some practical instances. The next section summarizes the old TRRL rotary capacity model and the most recent HCM 7th Edition modern roundabout capacity model.
Then, the two models are combined through an iteration-based algorithm, purpose-built and linked to the concept of roundabout total capacity, i.e., the value reached under a traffic flow pattern leading to the simultaneous congestion of all roundabout entrances. The solution is the average delay for each entrance of the rotary, from which its respective level of service is estimated. In view of further experimental applications, at this research stage a collection of existing rotary intersections operating under the priority-to-circle rule has already been started, both in the US and in Italy. The rotaries have been selected by direct inspection of aerial photos through a map viewer, namely Google Earth. Each instance has been recorded by location, urban or rural setting, and its main geometric patterns. Finally, concluding remarks are drawn, and some further research developments are discussed.
Keywords: mixed methods, old rotary and modern roundabout capacity models, total capacity algorithm, level of service estimation
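A hedged sketch of the two building blocks being combined: an HCM-style exponential entry capacity (the coefficients below are illustrative single-lane values, not the paper's calibrated TRRL/HCM-7 parameters) feeding an HCM-style control-delay formula, which turns a demand and a conflicting circulating flow into a per-entry average delay.

```python
import math

def entry_capacity(conflicting_flow, a=1380.0, b=1.02e-3):
    """Entry capacity (veh/h) as an exponential function of circulating flow (veh/h).
    Coefficients a, b are illustrative, not calibrated values."""
    return a * math.exp(-b * conflicting_flow)

def control_delay(demand, capacity, period_h=0.25):
    """HCM-style average control delay (s/veh) for one entry over an analysis period."""
    x = demand / capacity          # degree of saturation
    t = period_h
    return (3600.0 / capacity
            + 900.0 * t * ((x - 1.0)
                           + math.sqrt((x - 1.0) ** 2
                                       + (3600.0 / capacity) * x / (450.0 * t)))
            + 5.0 * min(x, 1.0))
```

The paper's iteration would scale the entry demands until every entrance saturates simultaneously (total capacity) and then report each entrance's delay and level of service.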
Procedia PDF Downloads 90
11489 The Application of Animal Welfare for Slaughterhouses in Bali Island
Authors: Budi B. Leksono, Mustopa
Abstract:
This study aims to determine the application of animal welfare principles at slaughterhouses on the island of Bali. The method used is purposive sampling, and the study covered two slaughterhouses, one in Denpasar district and one in Badung district. The results show that the degree of application of animal welfare principles from unloading the animals from the truck to the animal shelter is 73.19% at the Denpasar slaughterhouse, whereas at the Badung slaughterhouse it is 63.04%. The degree of application of animal welfare principles from the animal shelter to slaughter is 52.93% at the Denpasar slaughterhouse, whereas at the Badung slaughterhouse it is 75.96%. Based on these results, we can conclude that the slaughterhouses on Bali apply the principles of animal welfare but need to improve several aspects of animal welfare.
Keywords: animal welfare, Badung slaughterhouses, Bali Island, Denpasar slaughterhouses
Procedia PDF Downloads 263
11488 Impact of Boundary Conditions on the Behavior of Thin-Walled Laminated Column with L-Profile under Uniform Shortening
Authors: Jaroslaw Gawryluk, Andrzej Teter
Abstract:
Simply supported angle columns subjected to uniform shortening are tested. The experimental studies are conducted on a testing machine with the Aramis digital image correlation system and an acoustic emission system. The laminate samples are subjected to axial uniform shortening: the tested columns are loaded from zero to the maximal load destroying the L-shaped column, which allows the post-buckling behavior of the column to be observed until its collapse. Laboratory tests are performed at a constant cross-bar velocity of 1 mm/min. In order to eliminate stress concentrations between the sample and the support, flexible pads are used. The analyzed samples are made of carbon-epoxy laminate using the autoclave method. The configuration of the laminate layers is [60,0₂,-60₂,60₃,-60₂,0₃,-60₂,0,60₂]T, where direction 0 is along the length of the profile. The material parameters of the laminate are: Young’s modulus along the fiber direction, 170 GPa; Young’s modulus transverse to the fiber direction, 7.6 GPa; in-plane shear modulus, 3.52 GPa; in-plane Poisson’s ratio, 0.36. The dimensions of all columns are: length 300 mm, thickness 0.81 mm, flange width 40 mm. Next, two numerical models of the column, with and without flexible pads, are developed using the finite element method in Abaqus software. The L-profile laminate column is modeled using S8R shell elements, and the layup-ply technique is used to define the sequence of the laminate layers. The grips are modeled with R3D4 discrete rigid elements, and the flexible pad consists of C3D20R solid elements. In order to estimate the moment of first laminate layer damage, the following initiation criteria were applied: the maximum stress criterion and the Tsai-Hill, Tsai-Wu, Azzi-Tsai-Hill, and Hashin criteria. The best agreement of results was observed for the Hashin criterion. It was found that the use of the pad in the numerical model significantly influences the damage mechanism.
The model without pads exhibited much higher stiffness, as evidenced by a greater bifurcation load and damage initiation load under all analyzed criteria, lower shortening, and less deflection at the column center than the model with flexible pads. Acknowledgment: The project/research was financed in the framework of the project Lublin University of Technology-Regional Excellence Initiative, funded by the Polish Ministry of Science and Higher Education (contract no. 030/RID/2018/19).
Keywords: angle column, compression, experiment, FEM
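A minimal plane-stress sketch of the Hashin initiation criteria used above to flag first-ply damage (a failure index reaching 1 flags initiation). The strength values below are illustrative allowables for a carbon/epoxy ply, assumed for the example, not the paper's measured values.

```python
# Assumed, illustrative ply strengths in MPa (not the paper's allowables):
XT, XC = 2000.0, 1200.0   # fibre-direction tensile / compressive strength
YT, YC = 50.0, 200.0      # transverse tensile / compressive strength
SL, ST = 80.0, 60.0       # longitudinal / transverse shear strength

def hashin_indices(s11, s22, s12):
    """Plane-stress Hashin failure indices (fibre, matrix); >= 1 flags damage."""
    if s11 >= 0.0:  # fibre tension
        fibre = (s11 / XT) ** 2 + (s12 / SL) ** 2
    else:           # fibre compression
        fibre = (s11 / XC) ** 2
    if s22 >= 0.0:  # matrix tension
        matrix = (s22 / YT) ** 2 + (s12 / SL) ** 2
    else:           # matrix compression
        matrix = ((s22 / (2.0 * ST)) ** 2
                  + ((YC / (2.0 * ST)) ** 2 - 1.0) * (s22 / YC)
                  + (s12 / SL) ** 2)
    return fibre, matrix
```

Note that the matrix-compression branch evaluates to exactly 1 at uniaxial transverse compression s22 = -YC, as the criterion requires.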
Procedia PDF Downloads 209
11487 Application of Semantic Technologies in Rapid Reconfiguration of Factory Systems
Authors: J. Zhang, K. Agyapong-Kodua
Abstract:
The digital factory, based on visual design and simulation, has emerged as a mainstream approach to shortening the digital development life cycle. Some basic industrial systems are being integrated via semantic modelling, and product (P) designs matching process (P)-resource (R) requirements are created to fulfill current customer demands. Nevertheless, product design is still limited to fixed product models and the known knowledge of product engineers. Therefore, this paper presents a rapid reconfiguration method based on semantic technologies with PPR ontologies to reuse both known and unknown knowledge. In order to cope with big data, our system uses a cloud manufactory and a distributed database to improve the efficiency of querying for matching PPR requirements.
Keywords: semantic technologies, factory system, digital factory, cloud manufactory
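An illustrative product-process-resource (PPR) matching sketch: if each process advertises required capabilities and each resource its provided capabilities, a reconfiguration query reduces to capability-set containment. The entries below are toy stand-ins for the PPR ontology individuals the paper would store in its distributed database, not its actual schema.

```python
# Toy PPR knowledge base (hypothetical names, for illustration only):
PROCESSES = {
    "drill_hole": {"drilling", "fixturing"},
    "weld_seam": {"welding", "fixturing"},
}
RESOURCES = {
    "robot_cell_1": {"welding", "fixturing", "handling"},
    "cnc_station": {"drilling", "milling", "fixturing"},
}

def resources_for(process):
    """Resources whose provided capabilities cover the process requirements."""
    required = PROCESSES[process]
    return sorted(r for r, caps in RESOURCES.items() if required <= caps)
```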
Procedia PDF Downloads 489
11486 Geometrically Linear Symmetric Free Vibration Analysis of Sandwich Beam
Authors: Ibnorachid Zakaria, El Bikri Khalid, Benamar Rhali, Farah Abdoun
Abstract:
The aim of the present work is to study the linear free symmetric vibration of a three-layer sandwich beam using the energy method. A zigzag model is used to describe the displacement field: the top and bottom layers behave like Euler-Bernoulli beams, while the core layer behaves like a Timoshenko beam. Based on Hamilton’s principle, the governing equation of motion of the sandwich beam is obtained in order to calculate the linear frequency parameters for clamped-clamped and simply supported beams. The effects of material properties and geometric parameters on the natural frequencies are also investigated.
Keywords: linear vibration, sandwich, shear deformation, Timoshenko zigzag model
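For reference, a simplified check quantity rather than the paper's zigzag model: the natural frequencies of a simply supported homogeneous Euler-Bernoulli beam, which the sandwich model should approach when the core shear stiffness is high. All stiffness and mass values here are assumed for illustration.

```python
import math

def natural_frequency_hz(n, length, EI, rho_A):
    """n-th bending natural frequency (Hz) of a simply supported
    Euler-Bernoulli beam with bending stiffness EI and mass per length rho_A."""
    omega = (n * math.pi / length) ** 2 * math.sqrt(EI / rho_A)
    return omega / (2.0 * math.pi)
```

For this boundary condition the frequencies scale as n², a useful sanity check against the sandwich model's limiting behavior.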
Procedia PDF Downloads 473
11485 Wind Speed Prediction Using Passive Aggregation Artificial Intelligence Model
Authors: Tarek Aboueldahab, Amin Mohamed Nassar
Abstract:
Unlike conventional power plants, wind energy is a fluctuating energy source; it is therefore necessary to accurately predict short-term wind speed in order to integrate wind energy into the electricity supply structure. To do so, we present a hybrid artificial intelligence model for short-term wind speed prediction based on passive aggregation of particle swarm optimization and neural networks. As a result, a clear improvement in prediction accuracy is obtained compared to the standard artificial intelligence method.
Keywords: artificial intelligence, neural networks, particle swarm optimization, passive aggregation, wind speed prediction
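A minimal particle swarm optimization sketch standing in for the weight-tuning stage of such a hybrid model; the sphere function below is a toy objective, whereas the paper minimizes the neural network's prediction error on wind-speed data, and the swarm parameters are common textbook choices, not the authors' settings.

```python
import random

def pso(objective, dim, n_particles=20, iters=100, seed=1):
    """Minimize `objective` over R^dim with a basic inertia-weight PSO."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda p: sum(x * x for x in p)  # toy stand-in for the NN training error
```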
Procedia PDF Downloads 453
11484 The Influence of Operational Changes on Efficiency and Sustainability of Manufacturing Firms
Authors: Dimitrios Kafetzopoulos
Abstract:
Nowadays, companies are increasingly concerned with adopting their own strategies for greater efficiency and sustainability. Dynamic environments are fertile fields for developing operational changes, and for this purpose organizations need to implement an advanced management philosophy that fosters changes to their operations. Changes refer to new applications of knowledge, ideas, methods, and skills that can generate unique capabilities and leverage an organization’s competitiveness. So, in order to survive and compete in global and niche markets, companies should incorporate the adoption of operational changes into their strategy with regard to both their products and their processes. Creating the appropriate culture for changes in products and processes helps companies gain a sustainable competitive advantage in the market. Thus, the purpose of this study is to investigate the role of both incremental and radical changes in a company's operations, considering not only product changes but also process changes, and then to measure the impact of these two types of changes on the business efficiency and sustainability of Greek manufacturing companies. The above discussion leads to the following hypotheses: H1: Radical operational changes have a positive impact on firm efficiency. H2: Incremental operational changes have a positive impact on firm efficiency. H3: Radical operational changes have a positive impact on firm sustainability. H4: Incremental operational changes have a positive impact on firm sustainability. In order to achieve the objectives of the present study, a research study was carried out in Greek manufacturing firms. A total of 380 valid questionnaires were received, and a seven-point Likert scale was used to measure all the questionnaire items of the constructs (radical changes, incremental changes, efficiency, and sustainability).
The constructs of radical and incremental operational changes, each treated as one variable, were subdivided into product and process changes. Non-response bias, common method variance, multicollinearity, multivariate normality, and outliers were checked. Moreover, the unidimensionality, reliability, and validity of the latent factors were assessed. Exploratory Factor Analysis and Confirmatory Factor Analysis were applied to check the factorial structure of the constructs and the factor loadings of the items. In order to test the research hypotheses, the SEM technique was applied (maximum likelihood method); the goodness of fit of the basic structural model indicates an acceptable fit of the proposed model. According to the present study's findings, radical and incremental operational changes significantly influence both the efficiency and the sustainability of Greek manufacturing firms. However, it is in the dimension of radical operational changes, in both process and product, that the most significant contributors to firm efficiency are found, while their influence on sustainability is low albeit statistically significant. On the contrary, incremental operational changes influence sustainability more than firm efficiency. From the above, it is apparent that embedding changes into a firm's product and process practices has direct and positive consequences for what it achieves from an efficiency and sustainability perspective.
Keywords: incremental operational changes, radical operational changes, efficiency, sustainability
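A hedged sketch of one reliability check of the kind mentioned above: Cronbach's alpha for a multi-item Likert construct. The response data used in the test is invented toy data, not the study's 380 questionnaires.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of per-item score lists,
    all measured over the same respondents."""
    k = len(items)
    item_var = sum(pvariance(it) for it in items)
    totals = [sum(vals) for vals in zip(*items)]   # scale total per respondent
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1.0 - item_var / total_var)
```

Perfectly covarying items drive alpha toward 1; uncorrelated items drive it toward 0, which is why alpha is reported alongside factor-analytic validity checks.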
Procedia PDF Downloads 137
11483 Some Results for F-Minimal Hypersurfaces in Manifolds with Density
Authors: M. Abdelmalek
Abstract:
In this work, we study hypersurfaces of constant weighted mean curvature embedded in weighted manifolds and give a condition for these hypersurfaces to be minimal. This condition is given by the ellipticity of the weighted Newton transformations. In particular, we prove that two compact hypersurfaces of constant weighted mean curvature embedded in space forms that intersect in at least one point of the boundary must be transverse. The method is based on computing the matrix of the second fundamental form at a boundary point and then the matrix associated with the Newton transformations; from the resulting equality, we obtain the weighted elementary symmetric functions on the boundary of the hypersurface. We close with some examples and applications; in particular, in Euclidean space, we use the above result to prove the Alexandrov spherical caps conjecture in the weighted case.
Keywords: weighted mean curvature, weighted manifolds, ellipticity, Newton transformations
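For context, one common convention for the weighted mean curvature appearing above (sign conventions vary across the literature; this is standard background, not a formula quoted from the paper):

```latex
% Ambient manifold with density e^{-f}; hypersurface with unit normal N
% and (unweighted) mean curvature H. One common sign convention:
H_f \;=\; H \;+\; \langle \nabla f,\, N \rangle .
% The hypersurface is f-minimal (weighted minimal) precisely when H_f \equiv 0,
% and "constant weighted mean curvature" means H_f is constant.
```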
Procedia PDF Downloads 96
11482 Entropy Generation of Unsteady Reactive Hydromagnetic Generalized Couette Fluid Flow of a Two-Step Exothermic Chemical Reaction Through a Channel
Authors: Rasaq Kareem, Jacob Gbadeyan
Abstract:
In this study, the entropy generation of an unsteady reactive hydromagnetic generalized Couette fluid flow with a two-step exothermic chemical reaction through a channel with isothermal wall temperature was analyzed under the influence of different chemical kinetics, namely sensitized, Arrhenius, and bimolecular kinetics. The modelled nonlinear dimensionless equations governing the fluid flow were simplified and solved using the combined Laplace Differential Transform Method (LDTM). The effects of the fluid parameters associated with the problem on the fluid temperature, entropy generation rate, and Bejan number are discussed and presented through graphs.
Keywords: Couette, entropy, exothermic, unsteady
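As background for the reported Bejan number (a standard definition, not a formula taken from the paper), the dimensionless entropy generation is usually split into a heat-transfer part and a fluid-friction (here including magnetic) part:

```latex
% Standard split of the entropy generation number and Bejan number:
N_s = N_H + N_F, \qquad
Be \;=\; \frac{N_H}{N_H + N_F}, \qquad 0 \le Be \le 1 .
% Be > 1/2: heat-transfer irreversibility dominates;
% Be < 1/2: fluid-friction (and magnetic) irreversibility dominates.
```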
Procedia PDF Downloads 517
11481 Myanmar Consonants Recognition System Based on Lip Movements Using Active Contour Model
Authors: T. Thein, S. Kalyar Myo
Abstract:
Humans use visual information to understand speech content in noisy conditions or in situations where the audio signal is not available. The primary advantage of visual information is that it is not affected by acoustic noise and cross talk among speakers, so using visual information from lip movements can improve the accuracy and robustness of automatic speech recognition. However, a major challenge for most automatic lip reading systems is finding a robust and efficient method for extracting the linguistically relevant speech information from a lip image sequence. This is a difficult task due to variation caused by different speakers, illumination, and camera settings, and due to the inherently low luminance and chrominance contrast between the lip and non-lip regions. Several researchers have been developing methods to overcome these problems, and it is well known that visual speech information obtained through lip reading is very useful for human speech recognition. Lip reading is the technique of comprehensively understanding the underlying speech by processing the movement of the lips. Lip reading systems are therefore among the supportive technologies for hearing-impaired or elderly people, and they are an active research area; the need for such systems is ever increasing for every language. This research aims to develop a visual teaching method system for hearing-impaired persons in Myanmar, showing how to pronounce words precisely by identifying the features of lip movement. The proposed research develops a lip reading system for Myanmar consonants: one-syllable consonants (င (Nga)၊ ည (Nya)၊ မ (Ma)၊ လ (La)၊ ၀ (Wa)၊ သ (Tha)၊ ဟ (Ha)၊ အ (Ah) ) and two-syllable consonants ( က(Ka Gyi)၊ ခ (Kha Gway)၊ ဂ (Ga Nge)၊ ဃ (Ga Gyi)၊ စ (Sa Lone)၊ ဆ (Sa Lain)၊ ဇ (Za Gwe) ၊ ဒ (Da Dway)၊ ဏ (Na Gyi)၊ န (Na Nge)၊ ပ (Pa Saug)၊ ဘ (Ba Gone)၊ ရ (Ya Gaug)၊ ဠ (La Gyi) ).
The proposed system has three subsystems: the first is the lip localization system, which localizes the lips in the digital inputs; the next is the feature extraction system, which extracts lip movement features suitable for visual speech recognition; and the final one is the classification system. In the proposed research, the Two-Dimensional Discrete Cosine Transform (2D-DCT) and Linear Discriminant Analysis (LDA) with an Active Contour Model (ACM) are used for lip movement feature extraction, and a Support Vector Machine (SVM) classifier is used to determine the class parameters and class number on the training and testing sets. Experiments will then be carried out on the recognition accuracy of Myanmar consonants using only the visual information on lip movements, and the results will show the effectiveness of lip movement recognition for Myanmar consonants. This system will help hearing-impaired persons use it as a language learning application, and it can also be useful for normal-hearing persons in noisy environments or conditions, letting them find out what was said by other people without hearing their voices.
Keywords: feature extraction, lip reading, lip localization, Active Contour Model (ACM), Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), Two-Dimensional Discrete Cosine Transform (2D-DCT)
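A hedged sketch of the 2D-DCT feature stage: the type-II DCT of a lip region of interest concentrates energy in the low-order coefficients, so keeping a small low-frequency set yields a compact feature vector for the SVM. The block size and coefficient count here are illustrative, not the paper's settings.

```python
import math

def dct2(block):
    """Naive 2-D type-II DCT (orthonormal) of a square n-by-n block."""
    n = len(block)
    def alpha(u):
        return math.sqrt(1.0 / n) if u == 0 else math.sqrt(2.0 / n)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = sum(block[i][j]
                    * math.cos((2 * i + 1) * u * math.pi / (2 * n))
                    * math.cos((2 * j + 1) * v * math.pi / (2 * n))
                    for i in range(n) for j in range(n))
            out[u][v] = alpha(u) * alpha(v) * s
    return out

def zigzag_features(coeffs, keep=6):
    """Keep the `keep` lowest-frequency coefficients (anti-diagonal order)."""
    n = len(coeffs)
    order = sorted((i + j, i, j) for i in range(n) for j in range(n))
    return [coeffs[i][j] for _, i, j in order[:keep]]
```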
Procedia PDF Downloads 286
11480 A Variable Structural Control for a Flexible Lamina
Authors: Xuezhang Hou
Abstract:
A control problem for a flexible lamina, formulated by partial differential equations with viscoelastic boundary conditions, is studied in this paper. The problem is written in the standard form of a linear infinite-dimensional system in an appropriate energy Hilbert space. The semigroup approach of linear operators is adopted to investigate the well-posedness of the closed-loop system. A variable structural control for the system is proposed, and an equivalent control method is applied to the thin plate system. A significant control-theoretic result is obtained: in terms of the semigroup approach, the thin plate can be approximated by the ideal sliding mode to any accuracy.
Keywords: partial differential equations, flexible lamina, variable structural control, semigroup of linear operators
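A finite-dimensional illustration of the variable structure / sliding mode idea (a toy double integrator, not the paper's infinite-dimensional semigroup setting): the state is driven to the sliding surface s = v + lam*x by the switching law u = -k*sign(s), and then slides toward the origin. Gains and step size are assumed values.

```python
def simulate_vsc(x0, v0, lam=1.0, k=5.0, dt=1e-3, steps=20000):
    """Explicit-Euler simulation of x'' = u with switching control
    u = -k * sign(v + lam * x); returns the final (x, v)."""
    x, v = x0, v0
    for _ in range(steps):
        s = v + lam * x                                   # sliding variable
        u = -k * (1.0 if s > 0 else -1.0 if s < 0 else 0.0)
        x += v * dt
        v += u * dt
    return x, v
```

Once on the surface, the motion obeys x' = -lam*x, so the state decays exponentially up to a small chattering band set by k*dt, mirroring the "ideal sliding mode to any accuracy" statement in a finite-dimensional toy.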
Procedia PDF Downloads 88
11479 Effects of Visual Agnosia in Children’s Linguistic Abilities: Psychoneurolinguistic Approach
Authors: Sadeq Al Yaari, Ayman Al Yaari, Adham Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Sajedah Al Yaari
Abstract:
Objective: The aim of the study is to examine the relationship between visual agnosia and learning delay in Yemeni children. Method: A total of 80 subjects participated, divided into an experimental group (n = 60; 30 males, 30 females) and a control group (n = 20; 10 males, 10 females), drawn from two institutions (one old, one new). The ages of all subjects range between 6 and 12 years. Pre- and post-tests were administered. Results: Overall, the results show severe effects of visual agnosia on the children's performance. This effect was milder in the group that received the treatment, as can be clearly seen in their post-test results compared with the group that did not receive the treatment; the outcomes as a whole are best understood in light of the control group.
Keywords: visual, agnosia, linguistics, abilities, effects, psychoneurolinguistics
Procedia PDF Downloads 43
11478 One Dimensional Magneto-Plasmonic Structure Based On Metallic Nano-Grating
Authors: S. M. Hamidi, M. Zamani
Abstract:
Magneto-plasmonic (MP) structures have become essential tools for amplifying magneto-optical (MO) responses by combining MO activity with surface plasmon resonance (SPR). The plasmonic and MO properties of the resulting MP structure become interrelated because of the SPR of the metallic medium. This interconnection can modify the wave vector of the surface plasmon polariton (SPP) in an MP multilayer [1], enhance the MO activity [2-3], and also modify sensor responses [4]. Several types of MP structures have been studied to enhance the MO response in miniaturized configurations. In this paper, we propose a new MP structure based on a metallic nano-grating, and we investigate the MO and optical properties of this new structure. The structure is fabricated by the DC magnetron sputtering method, and our home-made MO experimental setup is used to characterize it.
Keywords: magneto-plasmonic structures, magneto-optical effect, nano-grating
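The SPP wave vector mentioned above follows from the standard dispersion relation for a flat metal-dielectric interface, k_spp = k0 * sqrt(eps_m * eps_d / (eps_m + eps_d)). A minimal sketch follows; the gold permittivity at 633 nm is an illustrative textbook approximation, not a value from this work:

```python
import cmath
import math

def spp_effective_index(eps_metal, eps_dielectric):
    """Effective index n_eff = k_spp / k0 for a flat metal-dielectric interface."""
    return cmath.sqrt(eps_metal * eps_dielectric / (eps_metal + eps_dielectric))

# Illustrative values: gold at ~633 nm (approximate) against air.
eps_au = -11.6 + 1.2j
eps_air = 1.0

n_eff = spp_effective_index(eps_au, eps_air)
wavelength = 633e-9                  # vacuum wavelength, m
k0 = 2 * math.pi / wavelength        # vacuum wave number, rad/m
k_spp = n_eff * k0

# The real part of k_spp exceeds k0 (the SPP lies beyond the light line),
# which is why a grating or prism is needed to couple light into the mode.
```

A nano-grating of period L supplies momentum in multiples of 2*pi/L, closing exactly this wave-vector gap.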
Procedia PDF Downloads 565
11477 Perceptual Organization within Temporal Displacement
Authors: Michele Sinico
Abstract:
The psychological present has an actual extension. When a sequence of instantaneous stimuli falls within this short interval of time, observers perceive a compresence of events in succession, and the temporal order depends on the qualitative relationships between the perceptual properties of the events. Two experiments were carried out to study the influence of perceptual grouping, with and without temporal displacement, on the duration of auditory sequences. The psychophysical method of adjustment was adopted. The first experiment investigated the effect of the temporal displacement of a white noise on sequence duration. The second experiment investigated the effect of temporal displacement, along the pitch dimension, on the temporal shortening of the sequence. The results suggest that the temporal order of sounds, in the case of temporal displacement, is organized along the pitch dimension.
Keywords: time perception, perceptual present, temporal displacement, Gestalt laws of perceptual organization
Procedia PDF Downloads 252
11476 Numerical Investigation on Transient Heat Conduction through Brine-Spongy Ice
Authors: S. R. Dehghani, Y. S. Muzychka, G. F. Naterer
Abstract:
The ice accretion of salt water on cold substrates creates brine-spongy ice, a mixture of pure ice and liquid brine. A real-world case of this type of ice is superstructure icing, which occurs on marine vessels and offshore structures in cold and harsh conditions. Transient heat transfer through this medium causes phase changes between the brine pockets and the pure ice. Salt rejection during transient heat conduction increases the salinity of the brine pockets until they reach a local equilibrium state. Heat passing through the medium therefore does not merely change the sensible heat of the ice and brine pockets; latent heat plays an important role and affects the mechanism of heat transfer. In this study, a new analytical model for evaluating heat transfer through brine-spongy ice is suggested. The model considers heat transfer together with partial solidification and melting. The properties of brine-spongy ice are obtained from the properties of liquid brine and pure ice. A numerical solution using the Method of Lines discretizes the medium into a set of ordinary differential equations. The boundary conditions are chosen to match a typical application of this type of ice: one side is treated as a thermally insulated surface, and the other side is assumed to be suddenly exposed to a constant-temperature boundary. All cases are evaluated at temperatures between -20 °C and the freezing point of brine-spongy ice. Solutions are computed for salinities from 5 to 60 ppt. Time steps and space intervals are chosen to maintain a stable and fast solution. The variations of temperature, brine volume fraction, and brine salinity over time are the most important outputs of this study. The results show that transient heat conduction through brine-spongy ice can produce brine-pocket salinities ranging from the initial salinity up to 180 ppt.
The rate of variation of temperature is found to be slower for high-salinity cases. The maximum rate of heat transfer occurs at the start of the simulation and decreases as time passes. Brine pockets are smaller in regions closer to the colder side than near the warmer side. At the start of the solution, the numerical scheme tends to become unstable because of the sharp variation of temperature at the beginning of the process; adjusting the intervals resolves this instability. The analytical model, solved with a numerical scheme, is capable of predicting the thermal behavior of brine-spongy ice. This model and its numerical solutions are important for modeling the freezing of salt water and ice accretion on cold structures.
Keywords: method of lines, brine-spongy ice, heat conduction, salt water
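To illustrate the Method of Lines idea described above (a minimal sketch with made-up, dimensionless parameters, not the authors' model: it omits latent heat and the brine phase entirely), the 1D heat equation u_t = alpha * u_xx is discretized in space, leaving a set of ODEs integrated here with explicit Euler. One side is insulated; the other is suddenly held at a fixed cold temperature:

```python
def simulate(n=21, alpha=1.0, dx=0.05, dt=1e-4, steps=5000,
             t_init=0.0, t_cold=-20.0):
    """Method of Lines for u_t = alpha*u_xx: central differences in space,
    explicit Euler in time. Left side insulated, right side held at t_cold."""
    u = [t_init] * n
    u[-1] = t_cold                      # right side suddenly set cold
    r = alpha * dt / dx**2              # must be <= 0.5 for stability
    for _ in range(steps):
        new = u[:]
        # Insulated left boundary: ghost node mirrors u[1].
        new[0] = u[0] + r * (2 * u[1] - 2 * u[0])
        for i in range(1, n - 1):
            new[i] = u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
        new[-1] = t_cold                # fixed-temperature boundary
        u = new
    return u

profile = simulate()
# The profile decreases monotonically from the insulated side toward the
# cold boundary, and every node has cooled below its initial temperature.
```

The stability constraint r = alpha*dt/dx^2 <= 1/2 is the discrete counterpart of the "time steps and space intervals chosen properly" remark in the abstract; violating it reproduces exactly the start-up instability the authors describe.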
Procedia PDF Downloads 218
11475 Classifying Blog Texts Based on the Psycholinguistic Features of the Texts
Authors: Hyung Jun Ahn
Abstract:
With the growing importance of social media, it is imperative to analyze it in order to understand its users. Users share useful information and experiences through social media, much of it in the form of text. This study focused on blogs and tested whether the psycholinguistic characteristics of blog texts vary with the subject or the type of experience described. To this end, blog texts about four types of experience, Go, skiing, reading, and musicals, were collected through the search API of the Tistory blog service. The analysis showed that various psycholinguistic characteristics of the texts differ across the four categories. Moreover, a machine learning experiment that used these characteristics for automatic text classification showed significant performance. Specifically, an ensemble method based on functional trees and bagging appeared to be the most effective classifier.
Keywords: blog, social media, text analysis, psycholinguistics
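To illustrate the bagging idea behind the ensemble classifier mentioned above (a minimal pure-Python sketch on a made-up one-dimensional feature, not the study's functional-tree model), several decision stumps are trained on bootstrap resamples and combined by majority vote:

```python
import random

def train_stump(xs, ys):
    """Pick the threshold that best splits a 1D feature into classes 0/1."""
    best = (None, 0)                       # (threshold, correct count)
    candidates = sorted(set(xs))
    for i in range(len(candidates) - 1):
        t = (candidates[i] + candidates[i + 1]) / 2
        correct = sum(1 for x, y in zip(xs, ys) if (x >= t) == (y == 1))
        if correct > best[1]:
            best = (t, correct)
    return best[0]

def bagged_predict(stumps, x):
    votes = sum(1 for t in stumps if t is not None and x >= t)
    return 1 if votes * 2 >= len(stumps) else 0

random.seed(0)
# Hypothetical psycholinguistic score with two separable experience classes.
xs = [-3.0, -2.5, -2.0, -1.0, 1.0, 2.0, 2.5, 3.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]

stumps = []
for _ in range(15):                        # 15 bootstrap resamples
    idx = [random.randrange(len(xs)) for _ in range(len(xs))]
    stumps.append(train_stump([xs[i] for i in idx], [ys[i] for i in idx]))

preds = [bagged_predict(stumps, x) for x in xs]
accuracy = sum(p == y for p, y in zip(preds, ys)) / len(ys)
```

The study's pipeline replaces the single score with a vector of psycholinguistic features and the stump with a functional tree, but the resample-and-vote structure is the same.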
Procedia PDF Downloads 282
11474 Features of Rail Strength Analysis in Conditions of Increased Force Loading
Authors: G. Guramishvili, M. Moistsrapishvili, L. Andghuladze
Abstract:
The article considers the problems that arise when the axle load transferred from rolling stock to the rail increases from 210 kN to 270 kN. For rail strength analysis, it proposes a complex integral characteristic of rail force loading that takes into account all the acting force factors, characterizes the specific operating conditions of the rail structure, and defines the working capability of the structure. The analysis by this method shows that, under 270 kN loading, the rail meets the working assessment criteria for rails and rail structures: strength, rail track stability, rail link stability and its transverse stability, and traffic safety, which is rather important for the railways of post-Soviet countries.
Keywords: axial loading, rail force loading, rail structure, rail strength analysis, rail track stability
Procedia PDF Downloads 428
11473 Thermal Method Production of the Hydroxyapatite from Bone By-Products from Meat Industry
Authors: Agnieszka Sobczak-Kupiec, Dagmara Malina, Klaudia Pluta, Wioletta Florkiewicz, Bozena Tyliszczak
Abstract:
Introduction: Demand for phosphorus compounds is growing continuously; thus, alternative sources of this element are being sought. One such source could be by-products from the meat industry, which contain a prominent quantity of phosphorus compounds. Hydroxyapatite, a natural component of animal and human bones, is a leading material applied in bone surgery and also in stomatology. It is biocompatible, bioactive, and osteoinductive. Methodology: Hydroxyapatite preparation: The raw material was deproteinized and defatted bone pulp, called bone sludge, formed as a waste product of the bone deproteinization process whose main product is a protein hydrolysate. Hydroxyapatite was obtained by calcination in an electrically heated chamber kiln in an air atmosphere in two stages. In the first stage, the material was calcined at 600 °C for 3 hours. In the second stage, the unified material was calcined at three different temperatures (750 °C, 850 °C, and 950 °C), holding the material at the maximum temperature for 3 hours. Bone sludge: Pork bones from meat partitioning were used as the raw material for the production of the protein hydrolysate. After disintegration, a mixture of bone pulp and water with a small amount of lactic acid was boiled at 130-135 °C and under a pressure of 4 bar. After 3-3.5 hours, the boiled-out bones were separated on a sieve, and the protein-fat hydrolysate solution passed into a decanter, where the bone sludge was separated from it. Results of the study: The phase composition was analyzed by X-ray diffraction. Hydroxyapatite was the only crystalline phase observed in all the calcination products, and the XRD investigation showed that the degree of crystallization of hydroxyapatite increased with calcination temperature.
Conclusion: The research showed that the phosphorus content is around 12%, whereas the calcium content amounts to 28% on average. The calcination of bone waste at temperatures of 750-950 °C confirmed that the thermal utilization of deproteinized bone waste is possible. X-ray investigations confirmed that hydroxyapatite is the main component of the calcination products, and the XRD investigation showed that the degree of crystallization of hydroxyapatite increased with calcination temperature. The contents of calcium and phosphorus distinctly increased with calcination temperature, whereas the content of acid-soluble phosphorus decreased. This could be connected with the higher degree of crystallization of the material obtained at higher temperatures and its more stable structure. Acknowledgements: The authors would like to thank the National Centre for Research and Development (Grant no: LIDER//037/481/L-5/13/NCBR/2014) for providing financial support to this project.
Keywords: bone by-products, bone sludge, calcination, hydroxyapatite
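As a quick sanity check on the reported elemental contents (an illustrative calculation, not part of the study), the measured Ca and P mass fractions can be compared with stoichiometric hydroxyapatite, Ca10(PO4)6(OH)2:

```python
# Standard molar masses (g/mol).
CA, P, O, H = 40.078, 30.974, 15.999, 1.008

# Stoichiometric hydroxyapatite: Ca10(PO4)6(OH)2 has 26 oxygen atoms.
molar_mass = 10 * CA + 6 * P + 26 * O + 2 * H
ca_frac = 10 * CA / molar_mass        # ~0.399 -> 39.9 wt% Ca in pure HA
p_frac = 6 * P / molar_mass           # ~0.185 -> 18.5 wt% P in pure HA
ca_p_molar_ideal = 10 / 6             # ~1.67

# Reported averages for the calcined bone waste: ~28 wt% Ca, ~12 wt% P.
ca_p_molar_measured = (0.28 / CA) / (0.12 / P)
# ~1.8: close to the ideal 1.67, consistent with hydroxyapatite as the
# main phase plus minor calcium-bearing impurities.
```

The measured Ca/P molar ratio slightly above the stoichiometric 1.67 is typical of bone-derived apatites, which often contain carbonate substitutions and traces of CaO.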
Procedia PDF Downloads 289
11472 Hydrology and Hydraulics Analysis of Beko Abo Dam and Appurtenant Structure Design, Ethiopia
Authors: Azazhu Wassie
Abstract:
This study evaluated the maximum design flood for the design of the appurtenant structures, using the climatological and hydrological data available for the study area. The maximum design flood is determined by flood frequency analysis. Using this method, the peak discharge is 32,583.67 m3/s; because the dam site is not at the gauged station, the value is transferred to the site, giving a peak discharge of 38,115 m3/s. The study was conducted in June 2023. The dam is built across a river to create a reservoir on its upstream side for impounding water, which is used for various purposes such as irrigation, hydropower, navigation, and fishing. The total average annual runoff volume is estimated at 115.1 billion m3, and the land potential for irrigation development can exceed 3 million ha.
Keywords: dam design, flow duration curve, peak flood, rainfall, reservoir capacity, risk and reliability
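Flood frequency analysis of the kind described above is commonly carried out with the Gumbel (EV1) distribution fitted by the method of moments. A minimal sketch on a made-up series of annual maximum discharges (not the study's data):

```python
import math
import statistics

def gumbel_quantile(annual_maxima, return_period):
    """Method-of-moments Gumbel fit; returns the T-year flood estimate."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = math.sqrt(6) * std / math.pi          # scale parameter
    mu = mean - 0.5772 * beta                    # location parameter
    # Quantile for annual exceedance probability 1/T:
    return mu - beta * math.log(-math.log(1 - 1 / return_period))

# Hypothetical annual maximum discharges (m3/s) at a gauged station.
q = [12100, 15400, 9800, 21000, 17500, 13300, 25200, 11000, 19800, 16400]

q10 = gumbel_quantile(q, 10)       # 10-year flood
q100 = gumbel_quantile(q, 100)     # 100-year flood
# Longer return periods give larger design floods: q100 > q10 > mean(q).
```

A transfer to an ungauged dam site, as in the abstract, would then scale such an estimate, for example by a drainage-area ratio.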
Procedia PDF Downloads 30
11471 Evotrader: Bitcoin Trading Using Evolutionary Algorithms on Technical Analysis and Social Sentiment Data
Authors: Martin Pellon Consunji
Abstract:
Due to the rise in popularity of Bitcoin and other crypto assets as a store of wealth and speculative investment, there is an ever-growing demand for automated trading tools, such as bots, in order to gain an advantage over the market. Traditionally, trading in the stock market was done by professionals with years of training who understood patterns and exploited market opportunities in order to gain a profit. Nowadays, however, a larger portion of market participants are at minimum aided by market-data-processing bots, which can generally generate more stable signals than the average human trader. The rise in trading bot usage can be credited to the inherent advantages that bots have over humans: processing large amounts of data, lacking emotions of fear or greed, and predicting market prices using past data and artificial intelligence. Hence, a growing number of approaches have been brought forward to tackle this task. The general limitation of these approaches, however, comes down to the fact that limited historical data does not always determine the future, and that many market participants are still emotion-driven human traders. Moreover, developing markets such as the cryptocurrency space have even less historical data to interpret than most well-established markets. Because of this, some human traders have gone back to tried-and-tested traditional technical analysis tools for exploiting market patterns and simplifying the broader spectrum of data involved in making market predictions. This paper proposes a method that uses neuroevolution techniques on both sentiment data and the more traditionally human-consumed technical analysis data, in order to obtain a more accurate forecast of future market behavior and to account for the way both automated bots and human traders affect the market prices of Bitcoin and other cryptocurrencies.
This study's approach uses evolutionary algorithms to automatically develop increasingly improved populations of bots which, using the latest inflows of market-analysis and sentiment data, evolve to predict future market price movements efficiently. The effectiveness of the approach is validated by testing the system in a simulated historical trading scenario and in a live Bitcoin trading scenario, and by testing its robustness on other cryptocurrency and stock market scenarios. Experimental results over a 30-day period show that this method outperformed the buy-and-hold strategy by over 260% in terms of net profit, even when standard trading fees are taken into account.
Keywords: neuro-evolution, Bitcoin, trading bots, artificial neural networks, technical analysis, evolutionary algorithms
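The evolutionary loop described above can be sketched in miniature (a toy illustration on a synthetic price series with made-up parameters, not the paper's neuroevolution system): each genome encodes two moving-average windows for a crossover strategy, fitness is the final portfolio value on the series, and selection with mutation and elitism improves the population:

```python
import random

def backtest(prices, short_w, long_w, cash=1000.0):
    """Moving-average crossover: hold coins while short MA > long MA."""
    coins = 0.0
    for t in range(long_w, len(prices)):
        short_ma = sum(prices[t - short_w:t]) / short_w
        long_ma = sum(prices[t - long_w:t]) / long_w
        if short_ma > long_ma and cash > 0:
            coins, cash = cash / prices[t], 0.0
        elif short_ma <= long_ma and coins > 0:
            cash, coins = coins * prices[t], 0.0
    return cash + coins * prices[-1]

random.seed(1)
# Synthetic trending-and-noisy price series.
prices = [100.0]
for _ in range(300):
    prices.append(max(1.0, prices[-1] + random.gauss(0.1, 2.0)))

def fitness(g):
    return backtest(prices, g[0], g[1])

# Evolve (short_window, long_window) genomes with elitism and mutation.
pop = [(random.randint(2, 10), random.randint(11, 50)) for _ in range(20)]
best_initial = max(fitness(g) for g in pop)
for _ in range(15):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:5]                        # elitism: keep the best genomes
    pop = elite + [(min(10, max(2, e[0] + random.randint(-2, 2))),
                    min(50, max(11, e[1] + random.randint(-5, 5))))
                   for e in elite for _ in range(3)]
best_final = max(fitness(g) for g in pop)
# With elitism, the best fitness never decreases: best_final >= best_initial.
```

The paper's system replaces the two-parameter genome with neural-network weights fed by technical indicators and sentiment scores, but the evaluate-select-mutate cycle is the same.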
Procedia PDF Downloads 124
11470 One Plus One is More than Two: Why Nurse Recruiters Need to Use Various Multivariate Techniques to Understand the Limitations of the Concept of Emotional Intelligence
Authors: Austyn Snowden
Abstract:
Aim: To examine the construct validity of the Trait Emotional Intelligence Questionnaire-Short Form (TEIQue-SF). Background: Emotional intelligence involves the identification and regulation of our own emotions and the emotions of others. It is therefore a potentially useful construct in the investigation of recruitment and retention in nursing, and many questionnaires have been constructed to measure it. Design: Secondary analysis of an existing dataset of TEIQue-SF responses using the concurrent application of Rasch analysis and confirmatory factor analysis. Method: First-year undergraduate nursing and computing students completed the TEIQue-SF. Responses were analysed by synthesising the results of Rasch analysis and confirmatory factor analysis.
Keywords: emotional intelligence, Rasch analysis, factor analysis, nurse recruiters
Procedia PDF Downloads 467
11469 State’s Responsibility of Space Debris
Authors: Athari Farhani
Abstract:
The existence of space debris is a direct implication of human activities in outer space. The amount of orbital debris resulting from the human exploration and use of outer space has been steadily increasing, and space debris is the responsibility of the launching state. Space debris not only has a direct impact in the form of environmental pollution but can also harm and endanger the safety of human life. Although legal provisions govern the exploration and use of outer space, both international space law and the Liability Convention, these provisions are only basic principles, so further thought and effort are needed, such as new international legal instruments to regulate the existence of space debris. The method used in this research is normative juridical, with an approach based on written legal regulations, especially the international agreements related to space law.
Keywords: state’s responsibility, space debris, outer space, international law
Procedia PDF Downloads 107
11468 A Methodology Based on Image Processing and Deep Learning for Automatic Characterization of Graphene Oxide
Authors: Rafael do Amaral Teodoro, Leandro Augusto da Silva
Abstract:
Originating from graphite, graphene is a two-dimensional (2D) material that promises to revolutionize technology in many different areas, such as energy, telecommunications, civil construction, aviation, textiles, and medicine. This is possible because its structure, formed by carbon bonds, provides desirable optical, thermal, and mechanical characteristics that are interesting to multiple areas of the market. Thus, several research and development centers are studying different manufacturing methods and applications of graphene, efforts that are often hindered by the scarcity of agile and accurate methodologies to characterize the material, that is, to determine its composition, shape, size, and the number of layers and crystals. To address this, this study proposes a computational methodology that applies deep learning to identify graphene oxide crystals in order to characterize samples by crystal size. To achieve this, a fully convolutional neural network called U-Net was trained to segment SEM images of graphene oxide. The segmentation generated by the U-Net is fine-tuned with a per-class standard deviation technique, which allows crystals to be distinguished with different labels through an object delimitation algorithm. Next, the position, area, perimeter, and lateral measures of each detected crystal are extracted from the images. This information generates a database of the dimensions of the crystals that compose the samples. Finally, graphs showing the frequency distributions of crystal area and perimeter are created automatically. This methodological process resulted in a high capacity for segmenting graphene oxide crystals, with accuracy and F-score equal to 95% and 94%, respectively, on the test set.
Such performance demonstrates a high generalization capacity of the method in crystal segmentation, since it holds under significant changes in image extraction quality. The measurement of non-overlapping crystals presented an average error of 6% across the different measurement metrics, suggesting that the model provides high-quality measurements for non-overlapping segmentations. For overlapping crystals, however, a limitation of the model was identified. To overcome it, it is important to ensure that the samples to be analyzed are properly prepared; this minimizes crystal overlap during SEM image acquisition and guarantees a lower measurement error without greater data-handling effort. All in all, the method developed is a time saver with high measurement value, considering that it is capable of measuring hundreds of graphene oxide crystals in seconds, saving weeks of manual work.
Keywords: characterization, graphene oxide, nanomaterials, U-net, deep learning
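The per-crystal measurement step described above (delimiting labeled objects and extracting area and perimeter) can be sketched on a binary segmentation mask with a simple connected-component pass; this is an illustrative pure-Python stand-in on a made-up mask, not the authors' pipeline:

```python
from collections import deque

def measure_crystals(mask):
    """Label 4-connected components in a binary mask; return the area and
    perimeter (count of exposed cell edges) for each component."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    stats = []
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not labels[sy][sx]:
                next_label += 1
                area, perim = 0, 0
                queue = deque([(sy, sx)])
                labels[sy][sx] = next_label
                while queue:
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]:
                            perim += 1        # exposed edge of the crystal
                        elif not labels[ny][nx]:
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
                stats.append({"label": next_label, "area": area,
                              "perimeter": perim})
    return stats

# Hypothetical 6x7 segmentation mask containing two separate "crystals".
mask = [[0, 1, 1, 0, 0, 0, 0],
        [0, 1, 1, 0, 0, 1, 1],
        [0, 0, 0, 0, 0, 1, 1],
        [0, 0, 0, 0, 0, 1, 1],
        [0, 0, 0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0, 0, 0]]

crystals = measure_crystals(mask)
# Two components: a 2x2 crystal (area 4, perimeter 8) and a 3x2 crystal
# (area 6, perimeter 10); pixel counts scale to nm^2 via the SEM calibration.
```

This pixel-counting approach works for non-overlapping crystals, which is exactly why the abstract stresses sample preparation: overlapping crystals merge into a single component and their individual measures are lost.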
Procedia PDF Downloads 161