Search results for: deceptive features
2933 Recognition of Tifinagh Characters with Missing Parts Using Neural Network
Authors: El Mahdi Barrah, Said Safi, Abdessamad Malaoui
Abstract:
In this paper, we present an algorithm for reconstructing Tifinagh characters from incomplete 2D scans. The algorithm exploits the correlation between a lost block and its neighbors. The proposed system contains three main parts: pre-processing, feature extraction, and recognition. In the first step, we construct a database of Tifinagh characters. In the second step, we apply a shape analysis algorithm. In the classification part, we use a neural network. The simulation results demonstrate that the proposed method gives good results.
Keywords: Tifinagh character recognition, neural networks, local cost computation, ANN
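The block-repair step can be illustrated with a rough sketch: a missing block is filled with the neighbouring block whose one-pixel border correlates best with the border of the hole. This is only one plausible reading of the correlation-based reconstruction described above; the function names, the four-neighbour candidate set, and the border comparison are assumptions, not the authors' algorithm.

```python
# Sketch: fill a missing block of a character image with the most correlated neighbour block.
import numpy as np

def border(img, top, left, size):
    """One-pixel frame around the size x size block at (top, left), flattened to 1D."""
    return np.concatenate([
        img[top - 1, left - 1:left + size + 1],       # row above the block
        img[top + size, left - 1:left + size + 1],    # row below the block
        img[top:top + size, left - 1],                # column to the left
        img[top:top + size, left + size],             # column to the right
    ])

def fill_missing_block(img, top, left, size):
    """Copy in the neighbouring block whose border correlates best with the hole's border."""
    hole_border = border(img, top, left, size)
    best, best_corr = None, -np.inf
    for dr, dc in [(-size, 0), (size, 0), (0, -size), (0, size)]:   # up, down, left, right
        r, c = top + dr, left + dc
        if 1 <= r and r + size < img.shape[0] - 1 and 1 <= c and c + size < img.shape[1] - 1:
            corr = np.corrcoef(hole_border, border(img, r, c, size))[0, 1]
            if np.isfinite(corr) and corr > best_corr:
                best_corr, best = corr, img[r:r + size, c:c + size].copy()
    if best is not None:
        img[top:top + size, left:left + size] = best
    return img
```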
2932 About the Number of Fundamental Physical Interactions
Authors: Andrey Angorsky
Abstract:
This article studies the possible number of fundamental physical interactions. The theory of similarity, applied to a dimensionless quantity, the damping ratio, serves as the instrument of analysis. A structure with the features of the Higgs field emerges from a non-commutative expression for this ratio. An experimentally verifiable supposition about the nature of dark energy is put forward.
Keywords: damping ratio, dark energy, dimensionless quantity, fundamental physical interactions, Higgs field, non-commutative expression
2931 Google Translate: AI Application
Authors: Shaima Almalhan, Lubna Shukri, Miriam Talal, Safaa Teskieh
Abstract:
Since artificial intelligence is a rapidly evolving topic that has had a significant impact on technical growth and innovation, this paper examines people's awareness, use, and engagement with the Google Translate application. To see how familiar users are with the app and its features, quantitative and qualitative research was conducted. The findings revealed that consumers have a high level of confidence in the application, that people benefit considerably from this sort of innovation, and that it makes communication convenient.
Keywords: artificial intelligence, google translate, speech recognition, language translation, camera translation, speech to text, text to speech
2930 Design of Broadband Power Divider for 3G and 4G Applications
Authors: A. M. El-Akhdar, A. M. El-Tager, H. M. El-Hennawy
Abstract:
This paper presents a broadband power divider with an equal power division ratio. Two sections of transmission line transformers based on coupled microstrip lines are applied to obtain broadband performance. In addition, a design methodology is proposed for the novel structure. A prototype is designed and simulated to operate in the band from 2.1 to 3.8 GHz to fulfill the requirements of 3G and 4G applications. The proposed structure features reduced size and fewer resistors than other conventional techniques. Simulation verifies the proposed idea and design methodology.
Keywords: power dividers, coupled lines, microstrip, 4G applications
2929 A Semantic and Concise Structure to Represent Human Actions
Authors: Tobias Strübing, Fatemeh Ziaeetabar
Abstract:
Humans usually manipulate objects with their hands. To represent these actions in a simple and understandable way, we need a semantic framework. For this purpose, the Semantic Event Chain (SEC) method has already been presented; it represents an action by considering the touching and non-touching relations between manipulated objects in a scene. This method was improved by a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates information on static spatial relations (e.g. top, bottom) and dynamic spatial relations (e.g. moving apart, getting closer) between objects in an action scene. This leads to better action prediction as well as the ability to distinguish between more actions. Each eSEC manipulation descriptor is a large matrix with thirty rows and a large set of spatial relations between each pair of manipulated objects. The current eSEC framework has so far only been used for the category of manipulation actions, which ultimately involve the two hands. Here, we would like to extend this approach to a whole-body action descriptor and build a conjoint activity representation structure. For this purpose, we need a statistical analysis to modify the current eSEC by summarizing it while preserving its features, introducing a new version called Enhanced eSEC (e2SEC). This summarization can be done from two points of view: 1) reducing the number of rows in an eSEC matrix, and 2) shrinking the set of possible semantic spatial relations. To achieve these, we computed the importance of each matrix row in a statistical way, to see whether a particular row can be removed while all manipulations remain distinguishable from each other. On the other hand, we examined which semantic spatial relations can be merged without compromising the unity of the predefined manipulation actions. By performing the above analyses, we obtained the new e2SEC framework, which has 20% fewer rows, 16.7% fewer static spatial relations, and 11.1% fewer dynamic spatial relations. This simplification, while preserving the salient features of a semantic structure for representing actions, has a tremendous impact on the recognition and prediction of complex actions, as well as on interactions between humans and robots. It also creates a comprehensive platform to integrate with body-limb descriptors and dramatically increases system performance, especially in complex real-time applications such as human-robot interaction prediction.
Keywords: enriched semantic event chain, semantic action representation, spatial relations, statistical analysis
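A minimal sketch of the row-reduction check described above: a descriptor row may be dropped only if every pair of manipulation matrices remains distinguishable without it. The matrices below are random placeholders; the eSEC encoding of spatial relations itself is assumed, not reproduced.

```python
# Sketch: find descriptor rows whose removal keeps all manipulations distinguishable.
# The matrices stand in for real eSEC descriptors (30 rows x action length).
import numpy as np

def removable_rows(matrices):
    """Return indices of rows that can be dropped while every pair of matrices stays distinct."""
    n_rows = matrices[0].shape[0]
    removable = []
    for r in range(n_rows):
        keep = [i for i in range(n_rows) if i != r]
        reduced = [m[keep] for m in matrices]
        still_distinct = all(
            not np.array_equal(reduced[i], reduced[j])
            for i in range(len(reduced)) for j in range(i + 1, len(reduced))
        )
        if still_distinct:
            removable.append(r)
    return removable

# Toy usage: three 'manipulations' encoded as small integer matrices.
rng = np.random.default_rng(1)
actions = [rng.integers(0, 4, size=(30, 8)) for _ in range(3)]
print(removable_rows(actions))   # rows that are individually redundant for discrimination
```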
2928 Artificial Intelligence and Development: The Missing Link
Authors: Driss Kettani
Abstract:
ICT4D actors are naturally tempted to include AI in the range of enabling technologies and tools that could support and boost the Development process, and to refer to this as AI4D. But doing so assumes that AI complies with the very specific features of the ICT4D context, including, among others, affordability, relevance, openness, and ownership. Clearly, none of these is fulfilled, and the enthusiastic posture that AI4D is a natural part of ICT4D is not grounded and, to a certain extent, does not serve the purpose of Technology for Development at all. In the context of Development, it is important to emphasize and prioritize ICT4D in national digital transformation strategies, instead of borrowing "trendy" waves of the IT industry that are motivated by business considerations, with no specific care or consideration for Development.
Keywords: AI, ICT4D, technology for development, position paper
2927 NanoFrazor Lithography for Advanced 2D and 3D Nanodevices
Authors: Zhengming Wu
Abstract:
NanoFrazor lithography systems were developed as a first true alternative or extension to standard maskless nanolithography methods such as electron beam lithography (EBL). In contrast to EBL, they are based on thermal scanning probe lithography (t-SPL). Here, a heatable ultra-sharp probe tip with an apex of a few nm is used for patterning and simultaneously inspecting complex nanostructures. The heat impact from the probe on a thermally responsive resist generates these high-resolution nanostructures. The patterning depth of each individual pixel can be controlled with better than 1 nm precision using an integrated in-situ metrology method. Furthermore, the inherent imaging capability of the NanoFrazor technology allows for markerless overlay, which has been achieved with sub-5 nm accuracy, and it supports stitching layout sections together with < 10 nm error. Pattern transfer from such resist features at below 10 nm resolution was demonstrated. The technology has proven its value as an enabler of new kinds of ultra-high-resolution nanodevices as well as for improving the performance of existing device concepts. The application range for this new nanolithography technique is very broad, spanning from ultra-high-resolution 2D and 3D patterning to chemical and physical modification of matter at the nanoscale. Nanometer-precise markerless overlay and non-invasiveness to sensitive materials are among the key strengths of the technology. However, while patterning at below 10 nm resolution is achieved, significantly increasing the patterning speed at the expense of resolution is not feasible using the heated tip alone. Towards this end, an integrated laser write head for direct laser sublimation (DLS) of the thermal resist has been introduced for significantly faster patterning of micrometer- to millimeter-scale features. Remarkably, the areas patterned by the tip and the laser are seamlessly stitched together, and both processes work on the very same resist material, enabling a true mix-and-match process with no developing or any other processing steps in between. The presentation will include examples of (i) high-quality metal contacting of 2D materials, (ii) tuning photonic molecules, (iii) generating nanofluidic devices, and (iv) generating spintronic circuits. Some of these applications have been enabled only by the various unique capabilities of NanoFrazor lithography, such as the absence of damage from a charged particle beam.
Keywords: nanofabrication, grayscale lithography, 2D materials device, nano-optics, photonics, spintronic circuits
2926 Estimating Algae Concentration Based on Deep Learning from Satellite Observation in Korea
Authors: Heewon Jeong, Seongpyo Kim, Joon Ha Kim
Abstract:
Over the last few decades, the coastal regions of Korea have experienced red tide algal blooms, which are harmful and toxic to both humans and marine organisms. These blooms have been accelerated by eutrophication from human activities, certain oceanic processes, and climate change. Previous studies have tried to monitor and predict ocean algae concentration with bio-optical algorithms applied to satellite color images. However, accurate estimation of algal blooms remains challenging because of the complexity of coastal waters. Therefore, this study suggests a new method to identify the concentration of red tide algal blooms from images of the Geostationary Ocean Color Imager (GOCI), which represent the water environment of the seas around Korea. The method employed GOCI images of the water-leaving radiances centered at 443 nm, 490 nm, and 660 nm, as well as observed weather data (i.e., humidity, temperature, and atmospheric pressure), to build a database capturing the optical characteristics of algae and to train a deep learning algorithm. A convolutional neural network (CNN) was used to extract the significant features from the images, and an artificial neural network (ANN) was then used to estimate the concentration of algae from the extracted features. For training of the deep learning model, a backpropagation learning strategy was developed. The established method was tested and compared with the performance of the GOCI Data Processing System (GDPS), which is based on standard image processing and optical algorithms. The model had better performance in estimating algae concentration than the GDPS, which cannot estimate concentrations greater than 5 mg/m³. Thus, the deep learning model was trained successfully to assess algae concentration in spite of the complexity of the water environment. Furthermore, the results of this system and methodology can be used to improve the performance of remote sensing. Acknowledgement: This work was supported by the 'Climate Technology Development and Application' research project (#K07731) through a grant provided by GIST in 2017.
Keywords: deep learning, algae concentration, remote sensing, satellite
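A minimal sketch, assuming TensorFlow/Keras, of a CNN feature extractor on three-band radiance patches combined with the weather inputs and a dense (ANN) regression head. The patch size, layer widths, and variable names are illustrative assumptions, not the authors' architecture.

```python
# Sketch: CNN features from 3-band GOCI patches + weather inputs -> algae concentration.
import tensorflow as tf
from tensorflow.keras import layers, Model

image_in = layers.Input(shape=(32, 32, 3), name="radiance_patch")   # 443/490/660 nm bands
weather_in = layers.Input(shape=(3,), name="weather")               # humidity, temp, pressure

x = layers.Conv2D(16, 3, activation="relu")(image_in)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(32, 3, activation="relu")(x)
x = layers.MaxPooling2D()(x)
x = layers.Flatten()(x)                                             # CNN-extracted features

h = layers.concatenate([x, weather_in])                             # join with weather data
h = layers.Dense(64, activation="relu")(h)                          # ANN regression head
out = layers.Dense(1, activation="linear", name="algae_mg_m3")(h)

model = Model(inputs=[image_in, weather_in], outputs=out)
model.compile(optimizer="adam", loss="mse")                         # trained via backpropagation
# model.fit([patches, weather], concentrations, epochs=50, batch_size=32)
```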
2925 Feature Selection Approach for the Classification of Hydraulic Leakages in Hydraulic Final Inspection Using Machine Learning
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
Manufacturing companies are facing global competition and enormous cost pressure. The use of machine learning applications can help reduce production costs and create added value. Predictive quality enables product quality to be secured through data-supported predictions, using machine learning models as a basis for decisions on test results. Furthermore, machine learning methods are able to process large amounts of data, deal with unfavourable row-column ratios, detect dependencies between the covariates and the given target, and assess the multidimensional influence of all input variables on the target. Real production data are often subject to highly fluctuating boundary conditions and unbalanced data sets. Changes in production data manifest themselves in trends, systematic shifts, and seasonal effects. Thus, machine learning applications require intensive pre-processing and feature selection. Data pre-processing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets. Within the real data set of Bosch hydraulic valves used here, the comparability of production conditions in the production of hydraulic valves within certain time periods can be identified by applying the concept drift method. Furthermore, a classification model is developed to evaluate the feature importance in different subsets within the identified time periods. By selecting comparable and stable features, the number of features used can be significantly reduced without a strong decrease in predictive power. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predicting the quality characteristics of workpieces. In this research, the AdaBoost classifier is used to predict the leakage of hydraulic valves based on geometric gauge blocks from machining, mating data from the assembly, and hydraulic measurement data from end-of-line testing. In addition, the most suitable methods are selected and accurate quality predictions are achieved.
Keywords: classification, machine learning, predictive quality, feature selection
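A minimal sketch, assuming scikit-learn, of feature selection driven by AdaBoost feature importances followed by re-fitting the classifier on the reduced feature set. The placeholder arrays, the median importance threshold, and the train/test split are illustrative assumptions, not the Bosch production data or the study's pipeline.

```python
# Sketch: AdaBoost-based feature-importance selection for leakage classification.
# X (gauge blocks, mating data, end-of-line measurements) and y (leak / no leak) are placeholders.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split

X = np.random.rand(500, 40)                 # placeholder feature matrix
y = np.random.randint(0, 2, 500)            # placeholder leakage labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

selector = SelectFromModel(
    AdaBoostClassifier(n_estimators=200, random_state=0),
    threshold="median",                      # keep the more important half of the features
).fit(X_tr, y_tr)
X_tr_sel, X_te_sel = selector.transform(X_tr), selector.transform(X_te)

reduced = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr_sel, y_tr)
print("features kept:", X_tr_sel.shape[1], "test accuracy:", reduced.score(X_te_sel, y_te))
```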
2924 Great Art for Little Children - Games in School Education as Integration of Polish-Language, Eurhythmics, Artistic and Mathematical Subject Matter
Authors: Małgorzata Anna Karczmarzyk
Abstract:
Who is the contemporary child? What are his or her distinctive features making him or her different from earlier generations? And how should we teach in this dissimilar social reality? These questions constitute the key to my reflections on contemporary early school education, for, to my mind, games have become highly significant for the modern model of education. Publications and research are emerging that employ games to increase competence in business, tutoring, and coaching, as well as in academic education. Thanks to games, students and subordinates can be taught such abilities as problem solving, creativity, consistent pursuit of goals, resourcefulness, and communication skills.
Keywords: games, art, children, school education, integration
2923 Brainwave Classification for Brain Balancing Index (BBI) via 3D EEG Model Using k-NN Technique
Authors: N. Fuad, M. N. Taib, R. Jailani, M. E. Marwan
Abstract:
In this paper, a comparison of k-Nearest Neighbor (k-NN) algorithms for classifying the 3D EEG model in brain balancing is presented. The EEG signal recording was conducted on 51 healthy subjects. Development of the 3D EEG models involves pre-processing of the raw EEG signals and construction of spectrogram images. Maximum PSD values were then extracted as features from the model. There are three indices for the balanced brain: index 3, index 4, and index 5. There are significant differences in the EEG signals across brain balancing index (BBI) levels. The alpha (8–13 Hz) and beta (13–30 Hz) bands were used as input signals for the classification model. The k-NN classification result is 88.46% accuracy. These results prove that k-NN can be used for prediction in the brain balancing application.
Keywords: power spectral density, 3D EEG model, brain balancing, kNN
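A minimal sketch, assuming scikit-learn, of k-NN classification of the three balancing indices from maximum-PSD features of the alpha and beta bands. The random feature values and the choice of k are illustrative placeholders, not the study's data or settings.

```python
# Sketch: k-NN classification of brain balancing indices from maximum-PSD features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X = np.random.rand(51, 2)            # per-subject max PSD: [alpha_band, beta_band]
y = np.random.choice([3, 4, 5], 51)  # brain balancing index labels

clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(clf, X, y, cv=5)
print("mean cross-validated accuracy:", scores.mean())
```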
2922 Effects of Cellular Insulin Receptor Stimulators with Alkaline Water on Performance, Plasma Cholesterol, Glucose, Triglyceride Levels and Hatchability in Breeding Japanese Quail
Authors: Rabia Göçmen, Gülşah Kanbur, Sinan Sefa Parlat
Abstract:
The aim of this study is to determine the effects of cellular insulin receptor stimulators on performance, plasma glucose, high-density lipoprotein (HDL), low-density lipoprotein (LDL), total cholesterol, triglyceride, triiodothyronine (T3) and thyroxine (T4) hormone levels, and incubation features in breeding Japanese quails (Coturnix japonica). A total of 84 six-week-old breeding quails was used, 24 male and 60 female. The rations used in the experiment provided 2900 kcal/kg metabolic energy and 20% crude protein. Water pH was calibrated to 7.45. Ration and water were administered ad libitum to the animals. Metformin-HCl was used as the metformin source and chromium picolinate as the chromium source. The trial groups were a control group (basal ration), a metformin group (basal ration with metformin added at 20 mg/kg of feed), and a chromium picolinate group (basal ration with 1500 ppb Cr added to the feed). Regarding performance at the end of the experiment, live weight gain, feed consumption, egg weight, feed conversion ratio (feed consumption/egg weight), and egg production were significantly affected (p < 0.05). In terms of incubation features, hatchability and the hatchability of fertile eggs were not affected by the treatments. The fertility ratio was significantly affected by the metformin and chromium picolinate treatments, rising significantly compared with the control group (p < 0.05). According to the results of the experiment, plasma glucose level was not affected by the metformin and chromium picolinate treatments. Plasma total cholesterol, HDL, LDL, and triglyceride levels were significantly affected by the insulin receptor stimulators added to the ration (p < 0.05). Plasma T3 and T4 hormone levels were also significantly affected by the insulin receptor stimulators added to the ration (p < 0.05).
Keywords: chromium picolinate, cholesterol, hormone, metformin, quail
2921 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis
Authors: H. Jung, N. Kim, B. Kang, J. Choe
Abstract:
History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to uncertainties in the initial reservoir models. It is therefore important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by the principal components with eigenvalues of large magnitude. Secondly, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models that show the most similar or dissimilar well oil production rates (WOPR) to the true values (10% each). The other 80% of models are then classified by the trained SVM, and we select the models on the side of low WOPR errors. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models that have a geological trend similar to the true reservoir model. The average field of the selected models is utilized as a probability map for regeneration. Newly generated models can preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results, as it fails to find the correct geological features of the true model. However, history matching with the regenerated ensemble offers reliable characterization results by capturing the proper channel trend. Furthermore, it gives dependable predictions of future performance with reduced uncertainties. We propose a novel classification scheme that integrates PCA, MDS, and SVM for regenerating reservoir models. The scheme can easily sort out reliable models that have a channel trend similar to the reference in the lower-dimensional space.
Keywords: history matching, principal component analysis, reservoir modelling, support vector machine
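A minimal sketch, assuming scikit-learn, of the PCA, MDS, and SVM screening of an ensemble of permeability models. The random ensemble, the WOPR-error labels, and the parameter choices are illustrative assumptions, not the authors' workflow settings.

```python
# Sketch: screen a reservoir-model ensemble with PCA, MDS and an SVM classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.svm import SVC

perm = np.random.rand(100, 2500)          # 100 models, flattened permeability grids
wopr_error = np.random.rand(100)          # mismatch of each model's WOPR vs. the true rates

scores = PCA(n_components=10).fit_transform(perm)               # main geological characteristics
xy = MDS(n_components=2, random_state=0).fit_transform(scores)  # 2D projection via Euclidean distances

# Train on the 10% most similar (label 1) and 10% most dissimilar (label 0) models.
order = np.argsort(wopr_error)
train_idx = np.r_[order[:10], order[-10:]]
labels = np.r_[np.ones(10), np.zeros(10)]

svm = SVC(kernel="rbf").fit(xy[train_idx], labels)
selected = np.where(svm.predict(xy) == 1)[0]    # models expected to have low WOPR error
prob_map = perm[selected].mean(axis=0)          # average field used as the regeneration map
```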
2920 Automatic Checkpoint System Using Face and Card Information
Authors: Kriddikorn Kaewwongsri, Nikom Suvonvorn
Abstract:
In the deep south of Thailand, checkpoints for person verification are necessary for the security management of risk zones, such as official buildings in the conflict area. In this paper, we propose an automatic checkpoint system that verifies persons using information from ID cards and facial features. Methods for extracting and verifying a person's information are introduced, based on useful information such as the ID number and name extracted from official cards, and facial images from videos. The proposed system shows promising results and has a real impact on the local society.
Keywords: face comparison, card recognition, OCR, checkpoint system, authentication
2919 The Role of Strategic Metals in Cr-Al-Pt-V Composition of Protective Bond Coats
Authors: A. M. Pashayev, A. S. Samedov, T. B. Usubaliyev, N. Sh. Yusifov
Abstract:
Different types of coating technologies are widely used for gas turbine blades. Thermal barrier coatings, consisting of a ceramic top coat, a thermally grown oxide, and a metallic bond coat, are used for the thermal protection of hot-section components in gas turbine engines. The operational characteristics and longevity of high-temperature turbine blades substantially depend on the right choice of composition for the protective thermal barrier coatings. When choosing the composition of a coating and the content of its basic elements, the following factors must be considered: minimal differences between the thermal expansion coefficients of the elements; the working temperatures and the composition of the oxidizing environment, which define the conditions for the formation of protective layers; the intensity of diffusive processes and the rate at which the protective properties of the elements degrade; the extent of the influence on the fatigue durability of components during operation; and the use of elements with high thermal stability, satisfactory resistance to gas corrosion, and suitable density, hardness, thermal conductivity, and other physical characteristics. When forecasting and choosing a thermal barrier coating composition, not all of the above factors can be considered at the same time, as some of these characteristics are determined only by experimental studies. The studies and investigations carried out show that one of the main causes of failure of coatings used on gas turbine blades is that the physical-chemical features of the elements are not fully taken into consideration when determining the composition of the alloys. This leads to the formation of a more complex spatial structure whose composition also changes chaotically within some concentration interval, which does not promote the thermal and structural stability of the coating. For the purpose of increasing the thermal and structural resistance of gas turbine blade coatings, a new approach to forecasting the composition is offered, based on an analysis of the physical-chemical characteristics of the alloys that takes into account the size factor, electron configuration, crystal lattice type, and the Darken-Gurry method. As a result of calculations and experimental investigations, a new chromium-based four-component metallic bond coat for gas turbine blades is offered.
Keywords: gas turbine blades, thermal barrier coating, metallic bond coat, strategic metals, physical-chemical features
2918 From the Local to the Global: New Terrorism
Authors: Shamila Ahmed
Abstract:
The paper examines how the fluidity between the local level and the global level is an intrinsic feature of new terrorism. Using cosmopolitanism, the narratives of the two opposing sides, ISIS and the 'war on terrorism' response, are explored. It is demonstrated how the fluidity between these levels facilitates the radicalisation process, through exploring how groups such as ISIS highlight perceived injustices against Muslims locally and globally and thereby exploit the globalisation process, which has reduced the space between these levels. Similarly, it is argued that the 'war on terror' involves the intersection of fear, security, threat, risk, and social control as features of both the international 'war on terror' and intra-state policies.
Keywords: terrorism, war on terror, cosmopolitanism, global level terrorism
2917 A Resolution on Ideal University Teachers Perspective of Turkish Students
Authors: Metin Özkan
Abstract:
In the last decade, Turkish higher education has expanded dramatically. With this expansion, Turkey has come a long way in establishing an efficient system of higher education, which is moving into a 'mass' system with institutions spanning the whole country. This expansion as a quantitative target leads to questions about the quality of higher education services. The quality of higher education services depends mainly on the quality of educators, and the quality of educators is especially important in the Turkish higher education system due to the rapid rise in the number of universities and students. Therefore, it is important to reveal the portrait of the ideal university teacher from the point of view of students enrolled in the Turkish higher education system. The purpose of this study is to determine the portrait of the ideal university teacher according to the views of Turkish students. The research was carried out with a descriptive survey method combining qualitative and quantitative methodologies. The qualitative data were collected at Gaziantep University with the participation of 45 students enrolled in 15 different faculties; the quantitative section was performed on 217 students. The data were obtained through semi-structured interviews and an "Ideal University Teacher Assessment" form developed by the researcher. The interview form consists of two parts. The first part concerned personal information; the second part included questions about the characteristics of the ideal university teacher. The questions that constitute the second part of the interview are: "What is a good university teacher like?" and "What human qualities and professional skills should a university teacher have?". The assessment form, created from the qualitative data obtained from the interviews, was used to attain scaling values for pairwise comparison and ranking judgments. According to the study results, the characteristics of the ideal university teacher include being patient, tolerant, and understanding. The ideal university teacher also implements teaching methods such as encouraging students' critical thinking, accepting students' recommendations on how to conduct the lesson, and making use of new technologies. Motivating and respecting students, adopting a participative style, and adopting a sincere manner also characterize the ideal university teacher's relationships with students.
Keywords: faculty, higher education, ideal university teacher, teacher behavior
2916 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principles in Noisy Environments
Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic
Abstract:
Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition, implemented by matching pursuit, performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in "weight space", where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning, implemented by a sparse autoencoder, learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single district. Each speaker has 10 sentences; two are used for training and 8 for testing. Atomic index probabilities are created for each training sentence and for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB, and 30 dB. The algorithm has a baseline classification accuracy of ~93%, averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN and remains ~93% at 0 dB SNR.
Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder
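A minimal sketch of the final classification step described above: each sentence is reduced to a probability vector over dictionary-atom indices, and a test sentence is assigned to the speaker whose training profile is closest in Euclidean distance. The atom-index sequences and dictionary size are placeholders standing in for real matching-pursuit output.

```python
# Sketch: classify speakers by Euclidean distance between atom-index probability vectors.
import numpy as np

N_ATOMS = 256  # assumed size of the learned dictionary

def index_probabilities(atom_indices):
    """Normalised histogram of dictionary-atom usage for one sentence."""
    counts = np.bincount(atom_indices, minlength=N_ATOMS)
    return counts / counts.sum()

def identify(test_indices, trained_profiles):
    """Return the speaker whose training profile has the lowest Euclidean distance."""
    p = index_probabilities(test_indices)
    return min(trained_profiles, key=lambda spk: np.linalg.norm(p - trained_profiles[spk]))

# Toy usage: two speakers whose (placeholder) sentences favour different atoms.
rng = np.random.default_rng(0)
profiles = {
    "speaker_A": index_probabilities(rng.integers(0, 128, 500)),
    "speaker_B": index_probabilities(rng.integers(128, 256, 500)),
}
test_sentence = rng.integers(0, 128, 200)      # resembles speaker_A's atom usage
print(identify(test_sentence, profiles))       # -> speaker_A
```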
2915 New Approach in Sports Management of Great Sports Events
Authors: Taieb Kherafa Noureddine
Abstract:
The paper presents a new approach to sports management based on the principles of reengineering. By applying this modern management system, called reengineering, to sports activity, we hope to obtain better and better results, in order to improve both the health and the performance of trained athletes. The paper also presents the similarities between BPR (Business Process Reengineering) and sports management, as well as a proposed solution for a proper implementation of such a management model. The five components of the basic BPR model are presented, together with their features for sports management.
Keywords: business process reengineering, great sports events, sports management, training activities
2914 Imaging Features of Hepatobiliary Histiocytosis
Authors: Ayda Youssef, Tarek Rafaat, Iman zaky
Abstract:
Purpose: Langerhans' cell histiocytosis (LCH) is a not uncommon pathology that involves aberrant proliferation of a specific dendritic (Langerhans) cell. These atypical but mature cells of monoclonal origin can infiltrate many sites of the body and may occur as localized lesions or as widespread systemic disease. The liver is one of the uncommon sites of involvement. The twofold objective of this study is to illustrate the radiological presentation of this disease and to compare these results with previously reported series. Methods and Materials: Between 2007 and 2012, 150 patients with biopsy-proven LCH were treated in our hospital, a paediatric cancer tertiary care center. A retrospective review of radiographic images and reports was performed, and the 33 patients with liver involvement were stratified. All patients underwent imaging studies, mostly US and CT. A chart review was performed to obtain demographic, clinical, and radiological data, which were analyzed and compared to other published series. Results: A retrospective assessment of 150 patients with LCH was performed, among whom 33 patients were identified with liver involvement. All of these patients developed multisystemic disease; they were 12 females and 21 males, and seven of them had marked hepatomegaly. Diffuse hypodense liver parenchyma was encountered in five cases. The periportal location showed a certain predilection in cases of focal involvement: three cases had hypodense periportal soft tissue sheets, one of them associated with dilated biliary radicles, and only one case had multiple focal lesions unrelated to the portal tracts. On follow-up, two cases showed abnormal liver morphology with a bossed outline. Conclusion: LCH is a not infrequent disease, and a high index of suspicion should be maintained when liver involvement is being diagnosed. A biopsy is recommended in the presence of radiological suspicion. Chemotherapy is the preferred therapeutic modality. The imaging features of liver histiocytosis are not disease-specific and should be interpreted in conjunction with the clinical history and the results of biopsy. Clinical Relevance/Application: Radiologists should be aware of the different patterns of hepatobiliary histiocytosis so that early diagnosis and proper management of the patient can be conducted.
Keywords: langerhans' cell histiocytosis, liver, medical and health sciences, radiology
2913 Management and Marketing Implications of Tourism Gravity Models
Authors: Clive L. Morley
Abstract:
Gravity models and panel data modelling of tourism flows are receiving renewed attention, after decades of general neglect. Such models have quite different underpinnings from conventional demand models derived from micro-economic theory. They operate at a different level of data and with different theoretical bases. These differences have important consequences for the interpretation of the results and their policy and managerial implications. This review compares and contrasts the two model forms, clarifying the distinguishing features and the estimation requirements of each. In general, gravity models are not recommended for use to address specific management and marketing purposes.
Keywords: gravity models, micro-economics, demand models, marketing
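For reference, a generic tourism gravity model takes a form like the one below, relating the flow between origin i and destination j to their economic 'masses' and the distance between them; this is a standard textbook specification, not necessarily the one examined in the review.

```latex
T_{ij} \;=\; k\,\frac{M_i^{\alpha}\,M_j^{\beta}}{D_{ij}^{\gamma}}
\qquad\Longrightarrow\qquad
\ln T_{ij} \;=\; \ln k + \alpha \ln M_i + \beta \ln M_j - \gamma \ln D_{ij} + \varepsilon_{ij}
```

Here T_ij is the tourist flow, M_i and M_j are mass terms such as population or GDP, D_ij is the distance (or travel cost) between the two, and the log-linear form on the right is what is typically estimated on panel data.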
2912 Development of a New Device for Bending Fatigue Testing
Authors: B. Mokhtarnia, M. Layeghi
Abstract:
This work presents an original bending fatigue testing setup for the fatigue characterization of composite materials. A three-point quasi-static setup is introduced that is capable of applying stress-controlled loads with different loading waveforms, frequencies, and stress ratios. The setup is equipped with computerized measuring instruments to evaluate fatigue damage mechanisms. A detailed description of its different parts and working features is given, and a dynamic analysis was performed to verify the functional accuracy of the device. Feasibility was validated successfully by conducting experimental fatigue tests.
Keywords: bending fatigue, quasi-static testing setup, experimental fatigue testing, composites
2911 On Regional Climate Singularity: On Example of the Territory of Georgia
Authors: T. Davitashvili
Abstract:
In this paper, some results are presented from numerical simulations of airflow dynamics in the troposphere over the Caucasus Mountains under conditions of non-stationarity of the large-scale undisturbed background flow. The main features of the variability of atmospheric currents as air masses are transferred from the Black Sea to the land surface have been investigated. In addition, the effects of thermal and advective-dynamic atmospheric factors on changes in the West Georgian climate have been studied. It was shown that disproportionate warming of the Black Sea and the Colkhi lowland provokes an intensive strengthening of the circulation and a climate cooling effect in western Georgia.
Keywords: regional climate, numerical simulation, local circulation, orographic effect
2910 Predicting Success and Failure in Drug Development Using Text Analysis
Authors: Zhi Hao Chow, Cian Mulligan, Jack Walsh, Antonio Garzon Vico, Dimitar Krastev
Abstract:
Drug development is resource-intensive, time-consuming, and increasingly expensive with each developmental stage. The success rates of drug development are also relatively low, and the resources committed are wasted with each failed candidate. As such, a reliable method of predicting the success of drug development is in demand. The hypothesis was that some failed drug candidates are pushed through developmental pipelines based on false confidence and may possess common linguistic features identifiable through sentiment analysis. Here, the concept of using text analysis to discover such features in research publications and investor reports as predictors of success was explored. RStudio was used to perform text mining and lexicon-based sentiment analysis to identify affective phrases and determine their frequency in each document, and SPSS was then used to determine the relationship between the defined variables and the accuracy of predicting outcomes. A total of 161 publications were collected and categorised into 4 groups: (i) Cancer treatment, (ii) Neurodegenerative disease treatment, (iii) Vaccines, and (iv) Others (containing all other drugs that do not fit into the 3 categories). Text analysis was then performed on each document using two separate sentiment lexicons (BING and AFINN) in R, within each drug category, to determine the frequency of positive or negative phrases in each document. Relative positivity and negativity values were then calculated by dividing the frequency of such phrases by the word count of each document. Regression analysis was then performed with SPSS statistical software on each dataset (the values obtained using the BING or AFINN lexicon during text analysis), using a random selection of 61 documents to construct a model. The remaining documents were then used to determine the predictive power of the models. The model constructed from BING predicts the outcome of drug performance in clinical trials with an overall accuracy of 65.3%. The AFINN model had a lower accuracy at predicting outcomes compared to the BING model, at 62.5%, and was not effective at predicting the failure of drugs in clinical trials. Overall, the study did not show significant efficacy of the model at predicting the outcomes of drugs in development, and many improvements may need to be made to later iterations of the model to sufficiently increase the accuracy.
Keywords: data analysis, drug development, sentiment analysis, text-mining
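A minimal sketch of the relative-positivity/negativity features and the outcome regression described above, written here in Python rather than the R and SPSS tools the authors used. The tiny word lists, documents, and labels are toy placeholders, not the BING or AFINN lexicons or the study's corpus.

```python
# Sketch: lexicon-based positivity/negativity features + logistic regression on trial outcomes.
import numpy as np
from sklearn.linear_model import LogisticRegression

POSITIVE = {"promising", "significant", "robust", "effective", "breakthrough"}   # toy lexicon
NEGATIVE = {"failed", "adverse", "toxic", "inconclusive", "terminated"}

def sentiment_features(text):
    words = text.lower().split()
    n = max(len(words), 1)
    pos = sum(w in POSITIVE for w in words) / n   # relative positivity (count / word count)
    neg = sum(w in NEGATIVE for w in words) / n   # relative negativity
    return [pos, neg]

docs = ["promising and robust effect observed in phase II",
        "trial terminated after adverse and toxic events",
        "significant breakthrough, effective in most patients",
        "inconclusive results, candidate failed its endpoints"]
outcomes = [1, 0, 1, 0]                            # 1 = progressed, 0 = failed (toy labels)

X = np.array([sentiment_features(d) for d in docs])
model = LogisticRegression().fit(X, outcomes)
print(model.predict([sentiment_features("robust and promising safety profile")]))
```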
2909 On the Weightlessness of Vowel Lengthening: Insights from Arabic Dialect of Yemen and Contribution to Psychoneurolinguistics
Authors: Sadeq Al Yaari, Muhammad Alkhunayn, Montaha Al Yaari, Ayman Al Yaari, Aayah Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Fatehi Eissa
Abstract:
Introduction: It is well established that lengthening (longer duration) is considered one of the correlates of lexical and phrasal prominence. However, it is unexplored whether the scope of vowel lengthening in the Arabic dialect of Yemen (ADY) is differently affected by educated and/or uneducated speakers from different dialectal backgrounds. Specifically, the research aims to examine whether linguistic background acquired through different educational channels makes a difference in the speech of the speaker and how that is reflected in related psychoneurolinguistic impairments. Methods: For this purpose, we conducted an articulatory experiment in which a set of words from ADY was examined in the dialectal speech of one thousand seven hundred educated and uneducated Yemeni speakers aged 19-61 years who grew up in five regions of the country: northern, southern, eastern, western, and central. The speakers were accordingly assigned to five dialectal groups. A seven-minute video clip was shown to the participants, who were asked to spontaneously describe the scene they had just watched; the researchers then analyzed the recordings linguistically and statistically to weigh vowel lengthening in the participants' speech. Results: The results show that vowels (monophthongs and diphthongs) are lengthened by all participants. Unexpectedly, both educated and uneducated speakers from the northern and central dialects lengthen vowels. Compared with uneducated speakers from the same dialect, educated speakers lengthen fewer vowels in their dialectal speech. Conclusions: These findings support the notion that extensive exposure to dialects on account of the standard language can cause changes to the patterns of the dialects themselves, and this can be seen in the speech of educated and uneducated speakers of these dialects. Further research is needed to clarify the phonemic distinctive features and the frequency of lengthening in other open-class systems (i.e., nouns, adjectives, and adverbs). Phonetic and phonological report measures are needed, as well as validation of existing measures, for assessing phonemic vowel length in the Arabic population in general and in Arabic individuals with voice, speech, and language impairments in particular.
Keywords: vowel lengthening, Arabic dialect of Yemen, phonetics, phonology, impairment, distinctive features
2908 Case-Based Reasoning Application to Predict Geological Features at Site C Dam Construction Project
Authors: Shahnam Behnam Malekzadeh, Ian Kerr, Tyson Kaempffer, Teague Harper, Andrew Watson
Abstract:
The Site C hydroelectric dam is currently being constructed in north-eastern British Columbia on sub-horizontal sedimentary strata that dip approximately 15 meters from one bank of the Peace River to the other. More than 615 pressure sensors (vibrating wire piezometers) have been installed on bedding planes (BPs) since construction began, with over 80 more planned before project completion. These pressure measurements are essential for monitoring the stability of the rock foundation during and after construction and for dam safety purposes. BPs are identified by their clay gouge infilling, which varies in thickness from less than 1 to 20 mm and can be challenging to identify because the core drilling process often disturbs or washes away the gouge material. Without depth predictions from nearby boreholes, stratigraphic markers, and downhole geophysical data, it is difficult to confidently identify BP targets for the sensors. In this paper, a case-based reasoning (CBR) method was used to develop an empirical model called the Bedding Plane Elevation Prediction (BPEP) to help geologists and geotechnical engineers predict geological features and bedding planes at new locations in a fast and accurate manner. To develop the CBR, a database was built from 64 pressure sensors already installed on the key bedding planes BP25, BP28, and BP31 on the right bank, including bedding plane elevations and coordinates. Thirteen (20%) of the most recent cases were selected to validate and evaluate the accuracy of the developed model, while similarity was defined as the distance between previous cases and new cases when predicting the depth of significant BPs. The average difference between actual and predicted BP elevations for the above BPs was ±55 cm; 69% of predicted elevations were within ±79 cm of the actual BP elevations, and 100% of predicted elevations for new cases were within a ±99 cm range. Eventually, the actual results will be used to expand the database and improve BPEP so that it performs as a learning machine and predicts more accurate BP elevations for future sensor installations.
Keywords: case-based reasoning, geological feature, geology, piezometer, pressure sensor, core logging, dam construction
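A minimal sketch of distance-based case retrieval of the kind described above: the elevation of a bedding plane at a new location is estimated from the most similar (nearest) previously surveyed sensor cases. The coordinates, the choice of k, and the inverse-distance weighting are illustrative assumptions, not the BPEP model itself.

```python
# Sketch: case-based retrieval for bedding-plane elevation prediction.
# Cases are (easting, northing, elevation) tuples; the values are placeholders.
import numpy as np

cases = np.array([            # previously installed sensors on one bedding plane
    [500.0, 1200.0, 412.3],
    [520.0, 1185.0, 411.8],
    [560.0, 1230.0, 410.9],
    [610.0, 1210.0, 409.7],
])

def predict_elevation(x, y, cases, k=3):
    """Inverse-distance-weighted average of the k most similar (nearest) cases."""
    dists = np.hypot(cases[:, 0] - x, cases[:, 1] - y)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-6)        # closer cases count more
    return float(np.average(cases[nearest, 2], weights=weights))

print(predict_elevation(545.0, 1205.0, cases))     # predicted BP elevation at a new location
```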
2907 Isolation, Identification and Screening of Pectinase Producing Fungi Isolated from Apple (Malus Domestica)
Authors: Shameel Pervez, Saad Aziz Durrani, Ibatsam Khokhar
Abstract:
Pectinase is an enzyme that breaks down pectin, a compound responsible for the structural integrity of plants. Pectin is difficult and costly to break down mechanically, which is why many industries, including the food industry, use pectinase produced by microbes for pectin breakdown. Apple (Malus domestica) is an important fruit in terms of market value, yet every year millions of apples are wasted due to post-harvest rot caused by fungi. Fungi are the natural decomposers of our ecosystem and are infamous for the post-harvest rot of apple fruit, but at the same time they are prized for their high production of valuable extracellular enzymes such as pectinase. In this study, fungi belonging to different genera were isolated from rotten apples. Rotten apple samples were picked from different markets of Lahore. After surface sterilization, the rotten parts were cut into small pieces and placed onto MEA media plates for three days. Afterwards, distinct colonies were picked and purified by sub-culturing. The isolates were identified to genus level through the study of basic colony morphology and microscopic features. The isolates were then screened for pectinase activity on MS media to compare pectinase production and subsequently tested for pathogenic activity through the wound suspension method, to evaluate the pathogenic activity of the isolates in comparison with their pectinolytic activity. A total of twelve fungal strains were isolated from rotten apples, belonging to the genera Penicillium, Alternaria, Paecilomyces, and Rhizopus. Upon screening for pectinolytic activity, isolates Pen 1, Pen 4, and Rz showed high pectinolytic activity and were further subjected to DNA isolation and partial sequencing for species identification. The results of partial sequencing were combined with an in-depth study of morphological features, revealing Pen 1 as Penicillium janthinellum, Pen 4 as Penicillium griseofulvum, and Rz as Rhizopus microsporus. The pathogenic activity of all twelve isolates was evaluated: Penicillium spp. were highly pathogenic and destructive, as were Paecilomyces sp. and Rhizopus sp. However, Alternaria spp. were found to be more consistent in their pathogenic activity across all types of apples.
Keywords: apple, pectinase, fungal pathogens, penicillium, rhizopus
2906 Virtual Reality Design Platform to Easily Create Virtual Reality Experiences
Authors: J. Casteleiro- Pitrez
Abstract:
The interest in Virtual Reality (VR) keeps increasing among the community of designers. To develop this type of immersive experience, understanding new processes and methodologies is as fundamental as the complex implementation, which usually implies hiring a specialized team. In this paper, we introduce a case study: a platform that allows designers to easily create complex VR experiences. We present its features and its development process. We conclude that this platform provides a complete solution for the design and development of VR experiences, with no coding needed.
Keywords: creatives, designers, virtual reality, virtual reality design platform, virtual reality system, no-coding
2905 Regression-Based Approach for Development of a Cuff-Less Non-Intrusive Cardiovascular Health Monitor
Authors: Pranav Gulati, Isha Sharma
Abstract:
Hypertension and hypotension are known to have repercussions on the health of an individual, with hypertension contributing to an increased risk of cardiovascular disease and hypotension resulting in syncope. This prompts the development of a non-invasive, non-intrusive, continuous, and cuff-less blood pressure monitoring system to detect blood pressure variations and to identify individuals with acute and chronic heart ailments; because such devices are unavailable for practical daily use, it is difficult to screen and subsequently regulate blood pressure. The complexities that hamper steady monitoring of blood pressure comprise the variations in physical characteristics from individual to individual and the postural differences at the site of monitoring. We propose to develop a continuous, comprehensive cardio-analysis tool based on reflective photoplethysmography (PPG). The proposed device, in the form of eyewear, captures the PPG signal and estimates the systolic and diastolic blood pressure using a sensor positioned near the temporal artery. The system relies on regression models that are based on the extraction of key points from a pair of PPG wavelets. The proposed system provides an edge over existing wearables in that it allows for uniform contact and pressure with the temporal site, in addition to minimal disturbance by movement. Additionally, the feature extraction algorithms enhance the integrity and quality of the extracted features by reducing unreliable data sets. We tested the system with 12 subjects, of whom 6 served as the training dataset. For this, we measured blood pressure using a cuff-based BP monitor (Omron HEM-8712) and at the same time recorded the PPG signal from our cardio-analysis tool. The complete test was conducted using the cuff-based blood pressure monitor on the left arm, while the PPG signal was acquired from the temporal site on the left side of the head. This acquisition served as the training input for the regression model on the selected features. The other 6 subjects were used to validate the model by conducting the same test on them. The results show that the developed prototype can robustly acquire the PPG signal and can therefore be used to reliably predict blood pressure levels.
Keywords: blood pressure, photoplethysmograph, eyewear, physiological monitoring
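A minimal sketch, assuming scikit-learn, of regression models mapping PPG-derived key-point features to cuff-measured systolic and diastolic pressure. The feature names, the numeric values, and the linear-model choice are illustrative assumptions, not the authors' exact feature set or regression form.

```python
# Sketch: regression from PPG key-point features to systolic/diastolic blood pressure.
# The feature columns (rise time, peak ratio, pulse width) and values are placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

X_train = np.array([[0.18, 0.62, 0.41],     # one row per training recording
                    [0.21, 0.58, 0.44],
                    [0.16, 0.66, 0.39],
                    [0.20, 0.60, 0.43],
                    [0.17, 0.64, 0.40],
                    [0.22, 0.55, 0.46]])
y_train = np.array([[118, 76], [125, 82], [112, 72],
                    [122, 80], [115, 74], [130, 85]])   # cuff [systolic, diastolic] in mmHg

model = LinearRegression().fit(X_train, y_train)         # one model with two outputs
print(model.predict([[0.19, 0.61, 0.42]]))               # predicted [systolic, diastolic]
```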
2904 Sports Development in Nigeria
Authors: Bakari Mohammed
Abstract:
Sports performance and achievements have been the avenue through which great nations of the world exhibit their supremacy over others, pursued through a sports development strategy. Effective sports development therefore requires variables such as sports policy, sports funding, sports programmes, sports facilities, and sponsorship. The extent to which these variables are met will no doubt affect the effectiveness of any sports development effort. Two distinguishing features of the Nigerian sports system are its central organization and its employment for specific socio-political objectives. It is against this backdrop that this paper examines the politicization of sports, which parallels sports development in the enhanced role of sports, in contrast with the systems and management of developed nations.
Keywords: sport development, sport policy, personnel, program, facilities, funding, sponsorship