Search results for: pair programming
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1334

674 Copy Effect Myopic Anisometropia in a Pair of Monozygotic Twins: A Case Report

Authors: Fatma Sümer

Abstract:

Introduction: This case report aims to report myopic anisometropia with copy-image in monozygotic twins. Methods: In February 2021, a pair of 6-year-old identical twins was seen, referred to us from an external center with a diagnosis of amblyopia in the left eye. Both twins had a full ophthalmic examination, which included visual acuity testing, ocular motility testing, cycloplegic refraction, and fundus examination. Results: On examination, “copy image” myopic anisometropia was discovered. Twin 1 had anisometropia with myopic astigmatism in the left eye. His cycloplegic refraction was +1.00 (-0.75x75) in the right eye and -8.0 (-1.50x175) in the left eye. Similarly, twin 2 had anisometropia with myopic astigmatism in the left eye. His cycloplegic refraction was -7.75 (-1.50x180) in the left eye and +1.25 (-0.75x90) in the right eye. The best-corrected visual acuity was 20/60 in the amblyopic eyes and 20/20 in the unaffected eyes. There was no ocular deviation. In both patients, slit-lamp examination revealed no abnormalities in the anterior segments of either eye. Fundoscopic examination revealed no abnormalities. No abnormal ocular movements were demonstrated. Conclusion: As far as we have reviewed the literature, previous studies with twins were mostly concerned with mirror-effect myopic anisometropia and myopic anisometropia, whereas ipsilateral amblyopia and anisometropia have not been reported in monozygotic twins. This case underscores the possible genetic basis of myopic anisometropia.

Keywords: amblyopia, anisometropia, myopia, twins

Procedia PDF Downloads 150
673 Measuring Energy Efficiency Performance of MENA Countries

Authors: Azam Mohammadbagheri, Bahram Fathi

Abstract:

DEA has become a very popular method of performance measurement, but it still suffers from some shortcomings. One of these shortcomings is the issue of having multiple optimal solutions to the weights for efficient DMUs. The cross-efficiency evaluation, an extension of DEA, has been proposed to avoid this problem. Lam (2010) also proposed a mixed-integer linear programming formulation based on linear discriminant analysis and the super-efficiency method (MILP model) to avoid having multiple optimal weight solutions. In this study, we modify the MILP model to determine more suitable weight sets and also evaluate the energy efficiency of MENA countries as an application of the proposed model.
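
To make the weighting issue concrete, the sketch below sets up the classical input-oriented CCR multiplier LP that cross-efficiency evaluation builds on and scores every DMU with every other DMU's optimal weights. It is an illustration with invented data and dimensions, not the authors' modified MILP or its discriminant-analysis component.

```python
# Classical CCR multiplier LP plus a naive cross-efficiency matrix (illustrative only).
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])   # inputs  (DMU x m)
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                        # outputs (DMU x s)

def ccr_weights(k):
    """Return optimal output/input weights (u, v) and efficiency of DMU k."""
    n_dmu, m = X.shape
    s = Y.shape[1]
    # variables: [u_1..u_s, v_1..v_m]; maximize u . Y[k]  ->  minimize -u . Y[k]
    c = np.concatenate([-Y[k], np.zeros(m)])
    A_ub = np.hstack([Y, -X])                 # u . Y[j] - v . X[j] <= 0 for every DMU j
    b_ub = np.zeros(n_dmu)
    A_eq = np.concatenate([np.zeros(s), X[k]]).reshape(1, -1)   # normalisation v . X[k] = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    u, v = res.x[:s], res.x[s:]
    return u, v, -res.fun                     # efficiency = u . Y[k]

# Cross-efficiency: score every DMU with the weights chosen by DMU k.
scores = np.zeros((len(X), len(X)))
for k in range(len(X)):
    u, v, _ = ccr_weights(k)
    scores[k] = (Y @ u) / (X @ v)
print("cross-efficiency matrix:\n", scores.round(3))
print("mean cross-efficiency:", scores.mean(axis=0).round(3))
```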

Keywords: data envelopment analysis, discriminant analysis, cross efficiency, MILP model

Procedia PDF Downloads 681
672 The Algorithm to Solve the Extended General Malfatti’s Problem in a Convex Circular Triangle

Authors: Ching-Shoei Chiang

Abstract:

Malfatti’s problem asks for fitting 3 circles into a right triangle such that these 3 circles are tangent to each other and each circle is also tangent to a pair of the triangle’s sides. This problem has been extended to any triangle (called the general Malfatti’s problem). Furthermore, the problem has been extended to have 1+2+…+n circles inside the triangle with special tangency properties among the circles and the triangle sides; we call this the extended general Malfatti’s problem. In the extended general Malfatti’s problem, call it Tri(Tn), where Tn is the triangle number, there are closed-form solutions for the Tri(T₁) (inscribed circle) problem and the Tri(T₂) (3 Malfatti’s circles) problem. These problems become more complex when n is greater than 2. For the Tri(Tn) problem with n>2, algorithms have been proposed to solve these problems numerically. With a similar idea, this paper proposes an algorithm to find the radii of circles with the same tangency properties, except that instead of the boundary of the triangle being straight lines, we use convex circular arcs as the boundary and try to find Tn circles inside this convex circular triangle with the same tangency properties among the circles and the boundary arcs. We call these problems the Carc(Tn) problems. The CPU time taken for the Carc(T16) problem, which finds 136 circles inside a convex circular triangle with the specified tangency properties, is less than one second.

Keywords: circle packing, computer-aided geometric design, geometric constraint solver, Malfatti’s problem

Procedia PDF Downloads 105
671 Influence of the Molecular Architecture of a Polycarboxylate-Based Superplasticizer on the Rheological and Physicomechanical Properties of Cement Pastes

Authors: Alya Harichane, Abderraouf Achour, Abdelbaki Benmounah

Abstract:

The main difficulty encountered in the formulation of high-performance concrete (HPC) lies in choosing the most efficient cement-superplasticizer pair, the one allowing maximum water reduction, good workability of the concrete in the fresh state, and very good mechanical resistance in the hardened state. The aim of this work is to test the efficiency of three polycarboxylate ether-based superplasticizers (PCE) marketed in Algeria with CEM I 52.5 R cement and to study the effect of the chemical structure of the PCE on the zeta potential and the rheological and mechanical properties of cement pastes. The behaviour of the polymers in cement was characterized with a Malvern Zetasizer 2000 apparatus and a VT 550 viscometer. Results showed that the zeta potential and the rheological properties are related to the molecular weight and the carboxylic group density of the PCE. The PCE with a moderate molecular weight and the highest carboxylic group content gave the best dispersion (highest zeta potential) and the lowest viscosity. The effect of the chemical structure of the PCEs on mechanical properties was evaluated by formulating cement mortars with these PCEs. The results show that there is a correlation between the zeta potential of the polymer and the compressive strength of the cement paste.

Keywords: molecular weight, polycarboxylate-ether superplasticizer, rheology, zeta potential

Procedia PDF Downloads 81
670 Comparison of Chest Weight of Pure and Mixed Races Kabood 30-Day Squab

Authors: Sepehr Moradi, Mehdi Asadi Rad

Abstract:

The aim of this study is to evaluate and compare the chest weight of pure and mixed-race Kabood 30-day squabs with respect to sex, race, and some auxiliary variables. In this paper, 62 pigeons, forming 31 male-female pairs of equal age, were studied; a natural incubation was carried out for each pair. All squabs produced were slaughtered at 30 days of age after 12 hours of fasting, and their chests were then weighed on a scale with one-gram precision. A covariance analysis was used, since there were several auxiliary variables and unequal numbers of observations; SAS software was used for the statistical analysis. The mean chest weight of the pure race (Kabood-Kabood), with 8 records, was 123.8±32.3 g, and those of the mixed races Kabood-Namebar, Kabood-Parvazy, Kabood-Tizpar, Namebar-Kabood, Tizpar-Kabood, and Parvazi-Kabood, with 8, 8, 6, 12, 10, and 10 records, were 139.4±23.5, 122.7±23.8, 124.7±30.1, 50.3±29.3, 51.4±26.4, and 137±28.6 g, respectively. The mean 30-day chest weights for males and females were 87.3±2.5 and 82.7±2.6 g, respectively. The difference in 30-day chest weight between the Kabood-Kabood race and the Kabood-Namebar, Kabood-Parvazi, Tizpar-Kabood, Kabood-Tizpar, Namebar-Kabood, and Parvazi-Kabood mixed races was not significant. The effect of sex was significant at the 5% level (p < 0.05), but the interaction of sex and race was not significant. The auxiliary variable of father weight was significant at the 1% level (p < 0.01), but the auxiliary variable of mother weight was not significant. The results showed that the highest and lowest weights belonged to Kabood-Namebar and Namebar-Kabood, respectively.
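
For readers who want to reproduce this kind of analysis outside SAS, a minimal sketch of an equivalent unbalanced covariance analysis is given below; the synthetic records and column names (chest_g, race, sex, father_g, mother_g) only illustrate the data layout and are not the study's measurements.

```python
# Unbalanced ANCOVA sketch: chest weight on race, sex, their interaction, and covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
races = ["Kabood-Kabood", "Kabood-Namebar", "Namebar-Kabood"]
df = pd.DataFrame({
    "race": rng.choice(races, 62),
    "sex": rng.choice(["male", "female"], 62),
    "father_g": rng.normal(350, 30, 62),
    "mother_g": rng.normal(320, 30, 62),
})
df["chest_g"] = 100 + 0.1 * df["father_g"] + rng.normal(0, 20, 62)

# categorical effects plus continuous auxiliary variables
model = smf.ols("chest_g ~ C(race) * C(sex) + father_g + mother_g", data=df).fit()
print(anova_lm(model, typ=3))      # Type III tests handle the unequal group sizes
```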

Keywords: squab, Kabood race, 30-day chest weight, pigeons

Procedia PDF Downloads 148
669 Recognizing an Individual, Their Topic of Conversation and Cultural Background from 3D Body Movement

Authors: Gheida J. Shahrour, Martin J. Russell

Abstract:

The 3D body movement signals captured during human-human conversation include clues not only to the content of people’s communication but also to their culture and personality. This paper is concerned with the automatic extraction of this information from body movement signals. For the purpose of this research, we collected a novel corpus from 27 subjects, arranged into groups according to their culture; each group was arranged into pairs, and each pair communicated about different topics. A state-of-the-art recognition system is applied to the problems of person, culture, and topic recognition. We borrowed modeling, classification, and normalization techniques from speech recognition. We used Gaussian Mixture Modeling (GMM) as the main technique for building our three systems, obtaining 77.78%, 55.47%, and 39.06% accuracy from the person, culture, and topic recognition systems, respectively. In addition, we combined the above GMM systems with Support Vector Machines (SVM) to obtain 85.42%, 62.50%, and 40.63% accuracy for person, culture, and topic recognition, respectively. Although direct comparison among these three recognition systems is difficult, it seems that our person recognition system performs best for both GMM and GMM-SVM, suggesting that inter-subject differences (i.e. subjects’ personality traits) are a major source of variation. When removing these traits from the culture and topic recognition systems using the Nuisance Attribute Projection (NAP) and the Intersession Variability Compensation (ISVC) techniques, we obtained 73.44% and 46.09% accuracy from the culture and topic recognition systems, respectively.
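
A minimal sketch of the GMM back-end described above is shown below: one mixture is fitted per class and a test sequence is assigned to the class with the highest total log-likelihood. The feature dimensions and the random training data are placeholders standing in for the actual body-movement features.

```python
# Per-class GMM training and maximum-log-likelihood classification (standard recipe).
import numpy as np
from sklearn.mixture import GaussianMixture

def train_gmms(features_by_class, n_components=8):
    """features_by_class: dict label -> (n_frames, n_dims) array of training frames."""
    return {label: GaussianMixture(n_components=n_components, covariance_type="diag",
                                   random_state=0).fit(frames)
            for label, frames in features_by_class.items()}

def classify(gmms, test_frames):
    """Return the label whose GMM maximises the summed frame log-likelihood."""
    return max(gmms, key=lambda label: gmms[label].score_samples(test_frames).sum())

# toy example with random data standing in for body-movement features
rng = np.random.default_rng(0)
train = {"subject_A": rng.normal(0.0, 1.0, (500, 12)),
         "subject_B": rng.normal(0.5, 1.2, (500, 12))}
gmms = train_gmms(train)
print(classify(gmms, rng.normal(0.5, 1.2, (200, 12))))   # expected: subject_B
```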

Keywords: person recognition, topic recognition, culture recognition, 3D body movement signals, variability compensation

Procedia PDF Downloads 535
668 From Two-Way to Multi-Way: A Comparative Study for Map-Reduce Join Algorithms

Authors: Marwa Hussien Mohamed, Mohamed Helmy Khafagy

Abstract:

Map-Reduce is a programming model which is widely used to extract valuable information from enormous volumes of data. Map-Reduce is designed to support heterogeneous datasets, and Apache Hadoop Map-Reduce is used extensively to uncover hidden patterns in applications such as data mining and SQL processing. The most important operation for data analysis is the join operation, but the Map-Reduce framework does not directly support join algorithms. This paper explains and compares two-way and multi-way map-reduce join algorithms; we also implement the MR join algorithms and show the performance of each phase in them. Our experimental results show that, among the two-way join algorithms, the map-side join and the map-merge join take the longest time because of the preprocessing step of sorting the data, while among the multi-way join algorithms the reduce-side cascade join takes the longest time.
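
The sketch below illustrates the basic reduce-side (repartition) join that the compared algorithms build on, written as plain Python mapper/reducer functions with the Hadoop shuffle simulated by a dictionary; table contents are invented.

```python
# Framework-free reduce-side join: mapper tags records by source, reducer joins per key.
from collections import defaultdict

orders = [(101, "order-1"), (102, "order-2"), (101, "order-3")]   # (customer_id, order)
customers = [(101, "Alice"), (102, "Bob")]                         # (customer_id, name)

def map_phase():
    for key, value in orders:
        yield key, ("ORDERS", value)
    for key, value in customers:
        yield key, ("CUSTOMERS", value)

def shuffle(mapped):
    groups = defaultdict(list)                 # stands in for the Hadoop shuffle/sort
    for key, tagged in mapped:
        groups[key].append(tagged)
    return groups

def reduce_phase(groups):
    for key, tagged_values in groups.items():
        names = [v for tag, v in tagged_values if tag == "CUSTOMERS"]
        rows = [v for tag, v in tagged_values if tag == "ORDERS"]
        for name in names:                     # cross product of both tables for this key
            for order in rows:
                yield key, name, order

for row in reduce_phase(shuffle(map_phase())):
    print(row)
```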

Keywords: Hadoop, MapReduce, multi-way join, two-way join, Ubuntu

Procedia PDF Downloads 477
667 High-Temperature Tribological Characterization of Nano-Sized Silicon Nitride + 5% Boron Nitride Ceramic Composite

Authors: Mohammad Farooq Wani

Abstract:

Tribological studies on nano-sized ß-silicon nitride + 5% BN were carried out in dry air at high temperatures to clarify the lack of consensus in the bibliographic data concerning the tribological behavior of Si3N4 ceramics and the effect of doped hexagonal boron nitride on the coefficient of friction and the wear coefficient at different loads and elevated temperatures. The composites were prepared via high-energy mechanical milling and subsequent spark plasma sintering, using Y2O3 and Al2O3 as sintering additives. After sintering, the average crystallite size of Si3N4 was observed to be 50 nm. Tribological tests were performed as a function of load and temperature: friction coefficients of 0.16 to 1.183 and 0.54 to 0.71 were observed for the nano-sized ß-silicon nitride + 5% BN composite under normal loads of 10 N to 70 N and over the high-temperature range of 350 ºC to 550 ºC, respectively. Specific wear coefficients from 1.33×10⁻⁴ mm³N⁻¹m⁻¹ to 4.42×10⁻⁴ mm³N⁻¹m⁻¹ were observed for the nano-sized Si3N4 + 5% BN composite against a Si3N4 ball as the tribo-pair counterpart over the high-temperature range of 350 ºC to 550 ºC, while under normal loads of 10 N to 70 N specific wear coefficients of 6.91×10⁻⁴ mm³N⁻¹m⁻¹ to 1.70×10⁻⁴ mm³N⁻¹m⁻¹ were observed. The addition of BN to the Si3N4 composite resulted in a slight reduction of the friction coefficient and lower values of the wear coefficient.

Keywords: ceramics, tribology, friction and wear, solid lubrication

Procedia PDF Downloads 372
666 Comparative Analysis of Simulation-Based and Mixed-Integer Linear Programming Approaches for Optimizing Building Modernization Pathways Towards Decarbonization

Authors: Nico Fuchs, Fabian Wüllhorst, Laura Maier, Dirk Müller

Abstract:

The decarbonization of building stocks necessitates the modernization of existing buildings. Key measures for this include reducing energy demands through insulation of the building envelope, replacing heat generators, and installing solar systems. Given limited financial resources, it is impractical to modernize all buildings in a portfolio simultaneously; instead, prioritization of buildings and modernization measures for a given planning horizon is essential. Optimization models for modernization pathways can assist portfolio managers in this prioritization. However, modeling and solving these large-scale optimization problems, often represented as mixed-integer problems (MIP), necessitates simplifying the operation of building energy systems particularly with respect to system dynamics and transient behavior. This raises the question of which level of simplification remains sufficient to accurately account for realistic costs and emissions of building energy systems, ensuring a fair comparison of different modernization measures. This study addresses this issue by comparing a two-stage simulation-based optimization approach with a single-stage mathematical optimization in a mixed-integer linear programming (MILP) formulation. The simulation-based approach serves as a benchmark for realistic energy system operation but requires a restriction of the solution space to discrete choices of modernization measures, such as the sizing of heating systems. After calculating the operation of different energy systems in terms of the resulting final energy demands in simulation models on a first stage, the results serve as input for a second stage MILP optimization, where the design of each building in the portfolio is optimized. In contrast to the simulation-based approach, the MILP-based approach can capture a broader variety of modernization measures due to the efficiency of MILP solvers but necessitates simplifying the building energy system operation. Both approaches are employed to determine the cost-optimal design and dimensioning of several buildings in a portfolio to meet climate targets within limited yearly budgets, resulting in a modernization pathway for the entire portfolio. The comparison reveals that the MILP formulation successfully captures design decisions of building energy systems, such as the selection of heating systems and the modernization of building envelopes. However, the results regarding the optimal dimensioning of heating technologies differ from the results of the two-stage simulation-based approach, as the MILP model tends to overestimate operational efficiency, highlighting the limitations of the MILP approach.
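
As a rough illustration of the single-stage MILP idea (not the authors' model), the sketch below selects modernization measures per building and per year under a yearly budget and a portfolio emission target, using the PuLP library; all costs and emission savings are invented.

```python
# Toy modernization-pathway MILP: pick measures per building/year under budget and target.
import pulp

buildings = ["B1", "B2"]
measures = {"insulation": {"cost": 80, "emission_cut": 10},
            "heat_pump":  {"cost": 120, "emission_cut": 25},
            "pv_system":  {"cost": 60, "emission_cut": 8}}
years = [2025, 2026, 2027]
budget_per_year = 150
baseline_emissions = 100          # t CO2 per year for the whole portfolio
emission_target = 60

prob = pulp.LpProblem("modernization_pathway", pulp.LpMinimize)
x = pulp.LpVariable.dicts("do", (buildings, measures, years), cat="Binary")

# objective: minimise total investment cost
prob += pulp.lpSum(measures[m]["cost"] * x[b][m][y]
                   for b in buildings for m in measures for y in years)

# each measure applied at most once per building over the horizon
for b in buildings:
    for m in measures:
        prob += pulp.lpSum(x[b][m][y] for y in years) <= 1

# yearly budget limit
for y in years:
    prob += pulp.lpSum(measures[m]["cost"] * x[b][m][y]
                       for b in buildings for m in measures) <= budget_per_year

# portfolio emission target at the end of the horizon
prob += baseline_emissions - pulp.lpSum(measures[m]["emission_cut"] * x[b][m][y]
                                        for b in buildings for m in measures
                                        for y in years) <= emission_target

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for b in buildings:
    for m in measures:
        for y in years:
            if x[b][m][y].value() == 1:
                print(f"{b}: apply {m} in {y}")
```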

Keywords: building energy system optimization, model accuracy in optimization, modernization pathways, building stock decarbonization

Procedia PDF Downloads 19
665 Evaluation and Compression of Different Language Transformer Models for Semantic Textual Similarity Binary Task Using Minority Language Resources

Authors: Ma. Gracia Corazon Cayanan, Kai Yuen Cheong, Li Sha

Abstract:

Training a language model for a minority language has been a challenging task. The lack of available corpora to train and fine-tune state-of-the-art language models is still a challenge in the area of Natural Language Processing (NLP). Moreover, the need for high computational resources and bulk data limits the attainment of this task. In this paper, we present the following contributions: (1) we introduce and use a Tagalog-English (TL-EN) translation pair set to pre-train a language model for a minority language resource; (2) we fine-tune and evaluate top-ranking pre-trained semantic textual similarity binary task (STSB) models on both TL-EN and STS dataset pairs; and (3) we reduce the size of the model to offset the need for high computational resources. Based on our results, the models that were pre-trained on translation pairs and STS pairs can perform well on the STSB task. Also, reducing the model to a smaller dimension has no negative effect on performance but rather yields a notable increase in the similarity scores. Moreover, models that were pre-trained on a similar dataset have a tremendous effect on the model’s performance scores.
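
A minimal sketch of this fine-tune-then-score workflow with the sentence-transformers library is given below; the base checkpoint, the toy Tagalog-English pairs, and the 0.5 decision threshold are assumptions for illustration, not the models or data used in the paper.

```python
# Fine-tune a sentence embedding model on labelled pairs, then threshold cosine similarity.
from sentence_transformers import SentenceTransformer, InputExample, losses, util
from torch.utils.data import DataLoader

model = SentenceTransformer("distiluse-base-multilingual-cased-v2")  # assumed base model

# hypothetical Tagalog-English pairs with binary labels (1 = same meaning, 0 = different)
train_examples = [
    InputExample(texts=["Magandang umaga", "Good morning"], label=1.0),
    InputExample(texts=["Salamat po", "Where is the station?"], label=0.0),
]
train_loader = DataLoader(train_examples, shuffle=True, batch_size=2)
train_loss = losses.CosineSimilarityLoss(model)
model.fit(train_objectives=[(train_loader, train_loss)], epochs=1, warmup_steps=10)

# scoring: cosine similarity thresholded into the binary decision
emb = model.encode(["Mahal kita", "I love you"], convert_to_tensor=True)
score = util.cos_sim(emb[0], emb[1]).item()
print("similar" if score >= 0.5 else "not similar", round(score, 3))
```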

Keywords: semantic matching, semantic textual similarity binary task, low resource minority language, fine-tuning, dimension reduction, transformer models

Procedia PDF Downloads 200
664 Two-Dimensional Seismic Response of Concrete Gravity Dams Including Base Sliding

Authors: Djamel Ouzandja, Boualem Tiliouine

Abstract:

The safety evaluation of concrete gravity dams subjected to seismic excitations is very complex, as the earthquake response of a concrete gravity dam depends upon its contraction joints with the foundation soil. This paper presents the seismic response of concrete gravity dams considering friction contact and welded contact; friction contact is provided using contact elements. A two-dimensional (2D) finite element model of the Oued Fodda concrete gravity dam, located in Chlef in the west of Algeria, is used for this purpose. Linear and nonlinear analyses considering dam-foundation soil interaction are performed using ANSYS software. The reservoir water is modeled as added mass using the Westergaard approach. The Drucker-Prager model is preferred for the dam and foundation rock in the nonlinear analyses. Surface-to-surface contact elements based on Coulomb's friction law are used to describe the friction; these contact elements use a target surface and a contact surface to form a contact pair. According to this study, base sliding should be included in the seismic analysis of concrete gravity dams: when friction contact is considered in the joints, base sliding displacement occurs along the dam-foundation soil contact interface, and this base sliding may generally decrease the principal stresses in the dam.
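
The Westergaard added-mass idea mentioned above has a simple closed form: the hydrodynamic effect of the reservoir is lumped as a mass per unit area m(y) = (7/8)·ρw·√(H·y) on the upstream face, where y is the depth below the free surface and H the reservoir depth. A minimal sketch follows; the reservoir depth, node spacing, and tributary areas are illustrative, not the Oued Fodda model values.

```python
# Westergaard added mass lumped at upstream-face nodes (illustrative numbers only).
import math

RHO_W = 1000.0       # kg/m^3
H = 100.0            # reservoir depth in m (illustrative)

def westergaard_added_mass(depth_y, tributary_area):
    """Added mass (kg) lumped at an upstream-face node at depth y below the surface."""
    return 7.0 / 8.0 * RHO_W * math.sqrt(H * depth_y) * tributary_area

# example: nodes every 10 m down the face, each with 10 m^2 of tributary area
for y in range(10, 101, 10):
    print(f"y = {y:3d} m  added mass = {westergaard_added_mass(y, 10.0):10.0f} kg")
```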

Keywords: concrete gravity dam, dynamic soil-structure interaction, friction contact, sliding

Procedia PDF Downloads 405
663 Airport Investment Risk Assessment under Uncertainty

Authors: Elena M. Capitanul, Carlos A. Nunes Cosenza, Walid El Moudani, Felix Mora Camino

Abstract:

The construction of a new airport or the extension of an existing one requires massive investments, and public-private partnerships have often been considered in order to make such projects feasible. One characteristic of these projects is uncertainty with respect to financial and environmental impacts over the medium to long term. Another is the multistage nature of these types of projects. While many airport development projects have been a success, some others have turned into a nightmare for their promoters. This communication puts forward a new approach for airport investment risk assessment. The approach explicitly takes into account the degree of uncertainty in activity level prediction and proposes milestones for the different stages of the project in order to minimize risk. Uncertainty is represented through fuzzy dual theory, and risk management is performed using dynamic programming. An illustration of the proposed approach is provided.

Keywords: airports, fuzzy logic, risk, uncertainty

Procedia PDF Downloads 404
662 Leveraging Power BI for Advanced Geotechnical Data Analysis and Visualization in Mining Projects

Authors: Elaheh Talebi, Fariba Yavari, Lucy Philip, Lesley Town

Abstract:

The mining industry generates vast amounts of data, necessitating robust data management systems and advanced analytics tools to achieve better decision-making processes in the development of mining production and maintaining safety. This paper highlights the advantages of Power BI, a powerful intelligence tool, over traditional Excel-based approaches for effectively managing and harnessing mining data. Power BI enables professionals to connect and integrate multiple data sources, ensuring real-time access to up-to-date information. Its interactive visualizations and dashboards offer an intuitive interface for exploring and analyzing geotechnical data. Advanced analytics is a collection of data analysis techniques to improve decision-making. Leveraging some of the most complex techniques in data science, advanced analytics is used to do everything from detecting data errors and ensuring data accuracy to directing the development of future project phases. However, while Power BI is a robust tool, specific visualizations required by geotechnical engineers may have limitations. This paper studies the capability to use Python or R programming within the Power BI dashboard to enable advanced analytics, additional functionalities, and customized visualizations. This dashboard provides comprehensive tools for analyzing and visualizing key geotechnical data metrics, including spatial representation on maps, field and lab test results, and subsurface rock and soil characteristics. Advanced visualizations like borehole logs and Stereonet were implemented using Python programming within the Power BI dashboard, enhancing the understanding and communication of geotechnical information. Moreover, the dashboard's flexibility allows for the incorporation of additional data and visualizations based on the project scope and available data, such as pit design, rock fall analyses, rock mass characterization, and drone data. This further enhances the dashboard's usefulness in future projects, including operation, development, closure, and rehabilitation phases. Additionally, this helps in minimizing the necessity of utilizing multiple software programs in projects. This geotechnical dashboard in Power BI serves as a user-friendly solution for analyzing, visualizing, and communicating both new and historical geotechnical data, aiding in informed decision-making and efficient project management throughout various project stages. Its ability to generate dynamic reports and share them with clients in a collaborative manner further enhances decision-making processes and facilitates effective communication within geotechnical projects in the mining industry.
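
As a rough illustration of the Python-inside-Power-BI pattern described above, the sketch below draws a simple borehole log with matplotlib. In a Power BI Python visual the selected fields arrive as a pandas DataFrame named dataset; the column names used here (hole_id, depth_from, depth_to, lithology) are assumptions rather than the dashboard's actual schema.

```python
# Simple borehole-log visual: one stacked bar per interval, coloured by lithology.
import matplotlib.pyplot as plt
import pandas as pd

# stand-in for the `dataset` DataFrame that Power BI injects into a Python visual
dataset = pd.DataFrame({
    "hole_id":    ["BH01"] * 3,
    "depth_from": [0.0, 4.5, 12.0],
    "depth_to":   [4.5, 12.0, 20.0],
    "lithology":  ["clay", "sand", "basalt"],
})

colors = {"clay": "#b5651d", "sand": "#f4d03f", "basalt": "#5d6d7e"}
fig, ax = plt.subplots(figsize=(2, 6))
for _, row in dataset.iterrows():
    ax.bar(0, row["depth_to"] - row["depth_from"], bottom=row["depth_from"],
           color=colors.get(row["lithology"], "grey"), width=0.8)
    ax.text(0.5, (row["depth_from"] + row["depth_to"]) / 2, row["lithology"], va="center")
ax.set_ylim(dataset["depth_to"].max(), 0)       # depth increases downwards
ax.set_ylabel("Depth (m)")
ax.set_xticks([])
ax.set_title(dataset["hole_id"].iloc[0])
plt.show()
```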

Keywords: geotechnical data analysis, power BI, visualization, decision-making, mining industry

Procedia PDF Downloads 85
661 Effect Of Tephrosia purpurea (Family: Fabaceae) Formulations On Oviposition By The Pulse Beetle Callosobruchus chinensis Linn.

Authors: Priyanka Jain, Meera Srivastava

Abstract:

Among the important insect pests of stored grains, the pulse beetle Callosobruchus chinensis Linn. (Coleoptera: Bruchidae) is one such pest, causing considerable damage to stored pulses. An effort was made to screen the plant Tephrosia purpurea (Family: Fabaceae) for its efficacy against the said pest. The pulse beetle C. chinensis was raised on green gram Vigna radiata in incubators maintained at 28 ± 2°C and 70% RH. Different formulations using plant parts (root, stem, leaf and fruit) were employed in the form of aqueous suspension, aqueous extract, and ether extract, and the treatments were made using different dose concentrations, namely 1%, 2.5%, 5%, and 10%, besides normal and control. A specific number of adult insects was released into muslin-cloth-covered beakers containing weighed green gram grains treated with the different dose concentrations (w/v). The number of eggs laid by the pest insect C. chinensis was recorded after three days of treatment, and it was observed that in general all the treatments of the plant resulted in a significant decrease in the eggs laid (no./pair) by the insect, suggesting that the selected plant has the potential to be used against C. chinensis.

Keywords: Callosobruchus chinensis, egg laying, Tephrosia purpurea, Fabaceae, plant formulations

Procedia PDF Downloads 339
660 Basics of SCADA Security: A Technical Approach

Authors: Michał Witas

Abstract:

This paper presents a technical approach to the analysis of the security of SCADA systems. The main goal of the paper is to make SCADA administrators aware of the risks resulting from SCADA system usage and to familiarize them with methods that can be applied to an existing or planned system in order to increase the overall security level. Because SCADA-based systems have become an industrial standard, more attention should be paid to their security. Industrial Control Systems (ICS) like SCADA are responsible for controlling crucial aspects of a wide range of industrial processes. That responsibility goes hand in hand with large amounts of money that can be earned or lost – this fact is the main reason for the increased interest of attackers. Additionally, ICS are often responsible for maintaining resources that are strategic from the point of view of the national economy, like electricity (including nuclear power plants), heating, water resources, or military facilities, so they can be targets of terrorist cyber attacks. Without proper risk analysis and management, vulnerabilities resulting from the usage of SCADA can be easily exploited by a potential attacker. The paper is based mostly on the author's own experience in systems security, gathered during academic studies and professional work in an international company. As the title suggests, it covers only the basics of the topic, because each of the points mentioned in the document can be the basis for additional research and papers.

Keywords: denial of service, SCADA, security policy, distributed network

Procedia PDF Downloads 368
659 Electrical Load Estimation Using Estimated Fuzzy Linear Parameters

Authors: Bader Alkandari, Jamal Y. Madouh, Ahmad M. Alkandari, Anwar A. Alnaqi

Abstract:

A new formulation of the fuzzy linear estimation problem is presented. It is formulated as a linear programming problem. The objective is to minimize the spread of the data points, taking into consideration the type of the membership function of the fuzzy parameters, to satisfy the constraints on each measurement point, and to ensure that the original membership is included in the estimated membership. Different models are developed for a fuzzy triangular membership. The proposed models are applied to different examples from the area of fuzzy linear regression and finally to different examples of estimating the electrical load on a busbar. It was found that the proposed technique is well suited for electrical load estimation, since the nature of the load is characterized by uncertainty and vagueness.
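
For orientation, the sketch below shows the classical Tanaka-style fuzzy linear regression LP that such formulations build on, applied to an invented busbar-load example; it minimizes the total spread of symmetric triangular fuzzy coefficients subject to inclusion constraints and is not the authors' new formulation.

```python
# Tanaka-style fuzzy linear regression as an LP (illustrative data).
import numpy as np
from scipy.optimize import linprog

# hypothetical busbar loads y driven by one explanatory variable (e.g. temperature)
x = np.array([20.0, 22.0, 25.0, 27.0, 30.0])
y = np.array([310.0, 330.0, 355.0, 372.0, 400.0])
X = np.column_stack([np.ones_like(x), x])        # design matrix with intercept
h = 0.5                                          # credibility level
n, p = X.shape

# variables: [a_0..a_(p-1), c_0..c_(p-1)]; minimise sum_i sum_j c_j * |x_ij|
cost = np.concatenate([np.zeros(p), np.abs(X).sum(axis=0)])
# inclusion constraints rewritten as <=:
#   -X a - (1-h)|X| c <= -y      (upper bound of the fuzzy output must cover y)
#    X a - (1-h)|X| c <=  y      (lower bound must stay below y)
A_ub = np.vstack([np.hstack([-X, -(1 - h) * np.abs(X)]),
                  np.hstack([ X, -(1 - h) * np.abs(X)])])
b_ub = np.concatenate([-y, y])
bounds = [(None, None)] * p + [(0, None)] * p    # centers free, spreads non-negative

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
centers, spreads = res.x[:p], res.x[p:]
print("centers:", centers.round(3), "spreads:", spreads.round(3))
```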

Keywords: fuzzy regression, load estimation, fuzzy linear parameters, electrical load estimation

Procedia PDF Downloads 532
658 Microstructure Analysis and Multiple Photoluminescence in High Temperature Electronic Conducting InZrZnO Thin Films

Authors: P. Jayaram, Prasoon Prasannan, N. K. Deepak, P. P. Pradyumnan

Abstract:

Indium and zirconium co-doped zinc oxide (InZrZnO) thin films were prepared by the chemical spray pyrolysis method on pre-heated quartz substrates. After deposition, the films were subjected to vacuum annealing at 400 ºC for three hours in an air ambience of 10⁻⁵ mbar. X-ray diffraction, scanning electron microscopy, energy dispersive spectra, and photoluminescence were used to characterize the films. Temperature-dependent electrical measurements were conducted on the films, and the films exhibit exceptional conductivity at higher temperatures. XRD analysis shows that all the films prepared in this work have the hexagonal wurtzite structure. The average crystallite sizes of the films were calculated using Scherrer’s formula, and the uniform deformation model (UDM) of the Williamson-Hall method was used to establish the micro-strain values. The dislocation density was determined from the Williamson and Smallman formula. Intense, broad, and strongly coupled multiple photoluminescence emissions were observed in the photoluminescence spectra. PL indicated a relatively high concentration of oxygen defects and Zn vacancies in the film composition. The strongly coupled ultraviolet and near-blue emissions confirm that the dopants are capable of inducing modulated free excitonic (FX), donor-acceptor pair (DAP), and longitudinal optical phonon emissions in the thin films.
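
The crystallite-size estimate quoted above comes from Scherrer's formula, D = Kλ/(β·cosθ), with β the peak FWHM in radians and θ half of the diffraction angle 2θ. A minimal sketch with an illustrative ZnO reflection (not the measured film data) is given below.

```python
# Scherrer crystallite-size estimate from a single diffraction peak (illustrative values).
import math

WAVELENGTH_NM = 0.15406      # Cu K-alpha
K = 0.9                      # shape factor

def scherrer_size_nm(two_theta_deg, fwhm_deg):
    theta = math.radians(two_theta_deg / 2.0)
    beta = math.radians(fwhm_deg)
    return K * WAVELENGTH_NM / (beta * math.cos(theta))

# illustrative (002) ZnO reflection
print(f"D = {scherrer_size_nm(34.4, 0.17):.1f} nm")
```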

Keywords: PL, SEM, TCOs, thin films, XRD

Procedia PDF Downloads 229
657 A Neural Network Classifier for Estimation of the Degree of Infestation by Late Blight on Tomato Leaves

Authors: Gizelle K. Vianna, Gabriel V. Cunha, Gustavo S. Oliveira

Abstract:

Foliage diseases in plants can cause a reduction in both the quality and quantity of agricultural production. Intelligent detection of plant diseases is an essential research topic, as it may help in monitoring large fields of crops by automatically detecting the symptoms of foliage diseases. This work investigates ways to recognize the late blight disease from the analysis of tomato digital images collected directly from the field. A pair of multilayer perceptron neural networks analyzes the digital images, using data from both the RGB and HSL color models, and classifies each image pixel. One neural network is responsible for the identification of healthy regions of the tomato leaf, while the other identifies the injured regions. The outputs of both networks are combined to generate the final classification of each pixel in the image, and the pixel classes are used to repaint the original tomato images with a color representation that highlights the injuries on the plant. The new images have only green, red, or black pixels, depending on whether they came from healthy portions of the leaf, injured portions, or the background of the image, respectively. The system presented an accuracy of 97% in the detection and estimation of the level of damage on the tomato leaves caused by late blight.
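
A minimal sketch of the two-network pixel classification scheme is given below; the sklearn MLPs, the toy color-based labels, and the decision order (injured first, then healthy, otherwise background) are illustrative assumptions rather than the trained networks from the paper.

```python
# Two MLPs on RGB+HSL pixel features, combined into a green/red/black repainting rule.
import colorsys
import numpy as np
from sklearn.neural_network import MLPClassifier

def features(rgb):
    """RGB in [0,1] -> 6-dim feature vector (R, G, B, H, L, S)."""
    r, g, b = rgb
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return [r, g, b, h, l, s]

rng = np.random.default_rng(0)
pixels = rng.random((600, 3))
X = np.array([features(p) for p in pixels])
y_healthy = (pixels[:, 1] > 0.5).astype(int)                             # toy label: greenish
y_injured = ((pixels[:, 0] > 0.5) & (pixels[:, 1] < 0.4)).astype(int)    # toy label: brownish

net_healthy = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000).fit(X, y_healthy)
net_injured = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000).fit(X, y_injured)

def repaint(pixel):
    f = [features(pixel)]
    if net_injured.predict(f)[0]:
        return (255, 0, 0)        # injured -> red
    if net_healthy.predict(f)[0]:
        return (0, 255, 0)        # healthy -> green
    return (0, 0, 0)              # background -> black

print(repaint((0.2, 0.7, 0.2)), repaint((0.6, 0.3, 0.1)))
```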

Keywords: artificial neural networks, digital image processing, pattern recognition, phytosanitary

Procedia PDF Downloads 323
656 Analyzing the Practicality of Drawing Inferences in Automation of Commonsense Reasoning

Authors: Chandan Hegde, K. Ashwini

Abstract:

Commonsense reasoning is the simulation of the human ability to make decisions in the situations that we encounter every day. Several decades have passed since the introduction of this subfield of artificial intelligence, yet it has made only limited progress. Modern computing aids have also remained ineffective in this regard due to the absence of a strong methodology for the development of commonsense reasoning. Among the several reasons accountable for the lack of progress, drawing inferences from a commonsense knowledge base stands out. This review paper focuses on a detailed analysis of the representation of reasoning uncertainties and the feasible prospects of programming aids for drawing inferences. The difficulties in deducing and systematizing commonsense reasoning, and the substantial progress made in reasoning that influences this study, are also discussed. Additionally, the paper discusses the possible impacts of an effective inference technique on commonsense reasoning.

Keywords: artificial intelligence, commonsense reasoning, knowledge base, uncertainty in reasoning

Procedia PDF Downloads 181
655 Energy Absorption Characteristic of a Coupler Rubber Buffer Used in Rail Vehicles

Authors: Zhixiang Li, Shuguang Yao, Wen Ma

Abstract:

The coupler rubber buffer has been widely applied on high-speed trains, and the main function of the rubber buffer is to dissipate the impact energy between vehicles. The rubber buffer consists of two groups of rubbers, which are both pre-compressed and then installed into the frame body. This paper focuses particularly on the energy absorption characteristics of the rubber buffers. Firstly, quasi-static compression tests were carried out for 1 and 3 pairs of rubber sheets, and some energy absorption response relationships, i.e. Eabn = n×Eab1, Edissn = n×Ediss1, and Ean = Ea1, were obtained. Next, a series of quasi-static tests were performed for 1 pair of rubber sheets to investigate the energy absorption performance at different compression ratios of the rubber buffers. Then impact tests with five impact velocities were conducted, and the coupler knuckle was destroyed when the impact velocity was 10.807 km/h. The impact test results showed that, with the increase of impact velocity, the Eab, Ediss, and Ea of the rear buffer increased considerably, whereas the three responses of the front buffer increased only slightly. Finally, the results of the impact tests and the quasi-static tests were compared, and the comparison showed that the values of Eab, Ediss, and Ea all increased with increasing stroke; however, the rates of increase in the impact tests were all larger than those in the quasi-static tests. The maximum value of Ea was 68.76% in the impact tests, which is a relatively high value for a vehicle coupler buffer. The energy capacity of the rear buffer determined for dynamic loading was 22.98 kJ.

Keywords: rubber buffer, coupler, energy absorption, impact tests

Procedia PDF Downloads 190
654 Malware Beaconing Detection by Mining Large-scale DNS Logs for Targeted Attack Identification

Authors: Andrii Shalaginov, Katrin Franke, Xiongwei Huang

Abstract:

One of the leading problems in cyber security today is the emergence of targeted attacks conducted by adversaries with access to sophisticated tools. These attacks usually steal senior-level employee system privileges in order to gain unauthorized access to confidential knowledge and valuable intellectual property. The malware used for the initial compromise of the systems is sophisticated and may target zero-day vulnerabilities. In this work we utilize a common behaviour of malware called “beaconing”, which implies that infected hosts communicate with Command and Control servers at regular intervals that have relatively small time variations. By analysing such beacon activity through passive network monitoring, it is possible to detect potential malware infections. We therefore focus on time gaps as indicators of possible C2 activity in targeted enterprise networks. We represent DNS log files as a graph whose vertices are destination domains and whose edges are timestamps. Then, by using four periodicity detection algorithms for each pair of internal-external communications, we check the timestamp sequences to identify the beacon activities. Finally, based on the graph structure, we infer the existence of other infected hosts and malicious domains enrolled in the attack activities.
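
A minimal sketch of the time-gap idea is shown below: it flags internal-host/domain pairs whose DNS inter-query gaps have a very low coefficient of variation. This is a simpler stand-in for the four periodicity detection algorithms used in the paper, and the log records and threshold are invented.

```python
# Flag regular (beacon-like) DNS query patterns per (internal host, external domain) pair.
from collections import defaultdict
from statistics import mean, pstdev

# hypothetical DNS log records: (timestamp_seconds, internal_host, queried_domain)
dns_log = [
    (0, "10.0.0.5", "updates.example.com"), (300, "10.0.0.5", "updates.example.com"),
    (601, "10.0.0.5", "updates.example.com"), (900, "10.0.0.5", "updates.example.com"),
    (17, "10.0.0.7", "news.example.org"), (523, "10.0.0.7", "news.example.org"),
    (810, "10.0.0.7", "news.example.org"),
]

timestamps = defaultdict(list)
for ts, host, domain in dns_log:
    timestamps[(host, domain)].append(ts)

def beacon_score(ts_list):
    """Coefficient of variation of the inter-query gaps; near 0 means very regular."""
    gaps = [b - a for a, b in zip(ts_list, ts_list[1:])]
    return pstdev(gaps) / mean(gaps) if len(gaps) > 1 else float("inf")

for pair, ts_list in timestamps.items():
    score = beacon_score(sorted(ts_list))
    flag = "BEACON?" if score < 0.1 else "ok"
    print(pair, round(score, 3), flag)
```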

Keywords: malware detection, network security, targeted attack, computational intelligence

Procedia PDF Downloads 259
653 Some Pertinent Issues and Considerations on CBSE

Authors: Anil Kumar Tripathi, Ratneshwer Gupta

Abstract:

All software engineering research and best industry practices aim at providing software products with a high degree of quality and functionality at low cost and in less time. These requirements are addressed by Component Based Software Engineering (CBSE) as well. CBSE, which deals with software construction by the assembly of components, is a revolutionary extension of software engineering. CBSE must define and describe processes to assure the timely completion of high-quality software systems that are composed of a variety of pre-built software components. Though these features provide distinct and visible benefits in software design and programming, they also raise some challenging problems. The aim of this work is to summarize the pertinent issues and considerations in CBSE so as to build an understanding, in the form of concepts and observations, that may lead to the development of new ways of dealing with the problems and challenges in CBSE.

Keywords: software component, component based software engineering, software process, testing, maintenance

Procedia PDF Downloads 393
652 Mathematical Modeling and Algorithms for the Capacitated Facility Location and Allocation Problem with Emission Restriction

Authors: Sagar Hedaoo, Fazle Baki, Ahmed Azab

Abstract:

In supply chain management, network design for scalable manufacturing facilities is an emerging field of research. Facility location allocation assigns facilities to customers to optimize the overall cost of the supply chain. To further optimize the costs, capacities of these facilities can be changed in accordance with customer demands. A mathematical model is formulated to fully express the problem at hand and to solve small-to-mid range instances. A dedicated constraint has been developed to restrict emissions in line with the Kyoto protocol. This problem is NP-Hard; hence, a simulated annealing metaheuristic has been developed to solve larger instances. A case study on the USA-Canada cross border crossing is used.

Keywords: emission, mixed integer linear programming, metaheuristic, simulated annealing

Procedia PDF Downloads 303
651 Intelligent Rescheduling Trains for Air Pollution Management

Authors: Kainat Affrin, P. Reshma, G. Narendra Kumar

Abstract:

Optimization of the timetable is the need of the day for the rescheduling and routing of trains in real time. Trains are scheduled in parallel with the road transport vehicles travelling to the same destination. As the number of trains is restricted by the single track, customers usually opt to use road transport frequently. Air pollution increases as the density of vehicles in road transport increases, and the use of an alternative mode of transport, such as the train, helps in reducing air pollution. This paper mainly aims at attracting passengers to train transport by proper rescheduling of trains using a hybrid of the stop-skip algorithm and an iterative convex programming algorithm. Bi-directional rescheduling of trains is achieved on a single track with dynamic dual time and varying stops. The introduction of more trains attracts customers to use rail transport frequently, thereby decreasing pollution. The results are simulated using Network Simulator (NS-2).

Keywords: air pollution, AODV, re-scheduling, WSNs

Procedia PDF Downloads 353
650 Object-Oriented Program Comprehension by Identification of Software Components and Their Connexions

Authors: Abdelhak-Djamel Seriai, Selim Kebir, Allaoua Chaoui

Abstract:

During the last decades, object-oriented programming has been massively used to build large-scale systems. However, the evolution and maintenance of such systems become a laborious task because object-oriented programming fails to offer a precise view of the functional building blocks of the system. This lack is caused by the fine granularity of classes and objects. In this paper, we use a post-object-oriented technology, namely software components, to propose an approach based on the identification of the functional building blocks of an object-oriented system by analyzing its source code. These functional blocks are specified as software components, and the result is a multi-layer component-based software architecture.

Keywords: software comprehension, software component, object oriented, software architecture, reverse engineering

Procedia PDF Downloads 408
649 The Malfatti’s Problem in Reuleaux Triangle

Authors: Ching-Shoei Chiang

Abstract:

Malfatti’s problem asks for fitting 3 circles into a right triangle such that they are tangent to each other and each circle is also tangent to a pair of the triangle’s sides. This problem has been extended to any triangle (called the general Malfatti’s problem). Furthermore, the problem has been extended to 1+2+…+n circles – we call it the extended general Malfatti’s problem – whose tangency graph, taking the centers of the circles as vertices with an edge connecting two centers whenever the two circles are tangent to each other, has the structure of Pascal’s triangle, while the exterior circles are tangent to the three sides of the triangle. In the extended general Malfatti’s problem there are closed-form solutions for n=1, 2, and the problem becomes complex when n is greater than 2. In solving the extended general Malfatti’s problem (n>2), we initially give values to the radii of all circles. From the tangency graph and the current radii, we can compute the angle between two vectors. These vectors go from the center of a circle to its tangency points with the surrounding elements, and these surrounding elements can be the boundary of the triangle or other circles. For each circle C, there are vectors from its center c to its tangency points with its neighbors (counted clockwise) pi, i=0, 1, 2, …, n. We add all the angles between cpi and cp(i+1) mod (n+1), i=0, 1, …, n, and call the sum sumangle(C) for circle C. Using sumangle(C), we reduce or enlarge the radii of all circles for the next iteration, until sumangle(C) is equal to 2π for all circles. With a similar idea, this paper proposes an algorithm to find the radii of circles whose tangency graph has the structure of Pascal’s triangle and whose exterior circles are tangent to the unit Reuleaux triangle.
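
To make the closure rule concrete, the sketch below applies the same enlarge-or-reduce iteration to a much simpler configuration than Carc(Tn): n equal circles packed around one central circle, where the angle contributed by each neighbour is known in closed form and the radius is adjusted until the angles around the center sum to 2π, mirroring the sumangle(C) = 2π stopping criterion.

```python
# Radius relaxation driven by an angle-closure condition (simplified ring-packing instance).
import math

def sumangle(R, r, n):
    """Total angle taken up at the central circle by n tangent surrounding circles."""
    return n * 2.0 * math.asin(r / (R + r))

def solve_ring_radius(R=1.0, n=6, tol=1e-12):
    lo, hi = 1e-9, 100.0 * R                 # bracket for the surrounding radius
    while hi - lo > tol:
        r = 0.5 * (lo + hi)
        if sumangle(R, r, n) < 2.0 * math.pi:
            lo = r                           # circles too small: enlarge
        else:
            hi = r                           # circles overlap: reduce
    return 0.5 * (lo + hi)

r = solve_ring_radius(R=1.0, n=6)
print(round(r, 6))                           # 6 unit neighbours around a unit circle -> r = 1
```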

Keywords: Malfatti’s problem, geometric constraint solver, computer-aided geometric design, circle packing, data visualization

Procedia PDF Downloads 124
648 Automatic Detection of Defects in Ornamental Limestone Using Wavelets

Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas

Abstract:

A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects according to the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These layers of defects constitute a database that will allow the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park, with human operators manipulating stone plates as large as 3 m x 2 m, weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The basis of the algorithm is a wavelet decomposition executed on two instances of the original image, to detect both hypotheses – dark and clear defects. The existence and/or size of these defects is the gauge used to classify the quality grade of the stone products. The tuning of the parameters available in the wavelet framework corresponds to different levels of accuracy in the drawing of the contours and in the selection of the defect sizes, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the dimension of defects allowed.
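
A minimal sketch of the underlying mechanism is given below: detail coefficients of a 2D wavelet decomposition respond to local contrast changes (spot edges, veins, cracks), and thresholding their energy yields a binary defect map. The synthetic plate image, the Haar wavelet, and the threshold factor are illustrative choices, not the tuned parameters of the paper.

```python
# Wavelet-detail defect map on a synthetic limestone plate image.
import numpy as np
import pywt

# synthetic 64x64 plate: uniform background with a dark spot and a vertical crack line
plate = np.full((64, 64), 200.0)
plate[21:27, 31:37] = 80.0            # dark spot
plate[:, 48] = 120.0                  # vertical crack

cA, (cH, cV, cD) = pywt.dwt2(plate, "haar")
detail_energy = cH**2 + cV**2 + cD**2

threshold = 10.0 * detail_energy.mean()
defect_small = detail_energy > threshold                              # half-resolution map
defect_map = np.kron(defect_small.astype(float), np.ones((2, 2))) > 0  # back to full size

rows, cols = np.where(defect_map)
print(f"flagged {defect_map.sum()} pixels, "
      f"rows {rows.min()}-{rows.max()}, cols {cols.min()}-{cols.max()}")
```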

Keywords: automatic detection, defects, fracture lines, wavelets

Procedia PDF Downloads 245
647 Machine Learning Analysis of Student Success in Introductory Calculus Based Physics I Course

Authors: Chandra Prayaga, Aaron Wade, Lakshmi Prayaga, Gopi Shankar Mallu

Abstract:

This paper presents the use of machine learning algorithms to predict the success of students in an introductory physics course. Data having 140 rows pertaining to the performance of two batches of students was used. The lack of sufficient data to train robust machine learning models was compensated for by generating synthetic data similar to the real data. CTGAN and CTGAN with Gaussian Copula (Gaussian) were used to generate synthetic data, with the real data as input. To check the similarity between the real data and each synthetic dataset, pair plots were made. The synthetic data was used to train machine learning models using the PyCaret package. For the CTGAN data, the Ada Boost Classifier (ADA) was found to be the ML model with the best fit, whereas the CTGAN with Gaussian Copula yielded Logistic Regression (LR) as the best model. Both models were then tested for accuracy with the real data. ROC-AUC analysis was performed for all the ten classes of the target variable (Grades A, A-, B+, B, B-, C+, C, C-, D, F). The ADA model with CTGAN data showed a mean AUC score of 0.4377, but the LR model with the Gaussian data showed a mean AUC score of 0.6149. ROC-AUC plots were obtained for each Grade value separately. The LR model with Gaussian data showed consistently better AUC scores compared to the ADA model with CTGAN data, except in two cases of the Grade value, C- and A-.
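
A minimal sketch of the synthetic-data-plus-AutoML pipeline is given below using the ctgan and pycaret packages; the synthetic course records, column names, epoch count, and sample size are placeholders, library APIs differ between versions, and the CTGAN-with-Gaussian-Copula variant would swap in an SDV synthesizer instead of the plain CTGAN shown here.

```python
# Fit CTGAN on a small real dataset, train classifiers on synthetic data, test on real data.
import numpy as np
import pandas as pd
from ctgan import CTGAN
from pycaret.classification import setup, compare_models, predict_model

# stand-in for the 140-row course dataset; column names and values are invented
rng = np.random.default_rng(0)
real = pd.DataFrame({
    "homework_avg": rng.uniform(40, 100, 140),
    "exam1": rng.uniform(30, 100, 140),
    "attendance": rng.integers(50, 100, 140),
})
real["Grade"] = pd.cut(0.5 * real["homework_avg"] + 0.5 * real["exam1"],
                       bins=[0, 60, 70, 80, 90, 101],
                       labels=["F", "D", "C", "B", "A"]).astype(str)

# 1. fit CTGAN on the real records and draw a larger synthetic training set
synthesizer = CTGAN(epochs=300)
synthesizer.fit(real, discrete_columns=["Grade"])
synthetic = synthesizer.sample(2000)

# 2. train candidate classifiers on the synthetic data with PyCaret's AutoML loop
setup(data=synthetic, target="Grade", session_id=42)
best_model = compare_models()

# 3. test the selected model back on the real data
holdout_predictions = predict_model(best_model, data=real)
print(holdout_predictions.head())
```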

Keywords: machine learning, student success, physics course, grades, synthetic data, CTGAN, gaussian copula CTGAN

Procedia PDF Downloads 39
646 Working Effectively with Muslim Communities in the West

Authors: Lisa Tribuzio

Abstract:

This paper explores the complexity of working with Muslim communities in Australia. It will draw upon the notions of belonging, social inclusion and effective community programming to engage Muslim communities in Western environments given the current global political climate. Factors taken into consideration for effective engagement include: family engagement, considering key practices such as Ramadan, fasting and prayer and food requirements, gender relations, core values around faith and spirituality, considering attitudes towards self disclosure in a counseling setting and the notion of Us and Them in the media and systems and its effect on minority communities. It will explore recent research in the field from Australian researchers as well as recommendations from United Nations in working with Muslim communities. It will also explore current practice models applied in Australia in engaging effectively with diverse communities and addressing racism and discrimination in innovative ways.

Keywords: Muslim, cultural diversity, social inclusion, racism

Procedia PDF Downloads 414
645 The Effect of Non-Normality on CB-SEM and PLS-SEM Path Estimates

Authors: Z. Jannoo, B. W. Yap, N. Auchoybur, M. A. Lazim

Abstract:

The two common approaches to Structural Equation Modeling (SEM) are the Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample size and when distributions are non-normal. This study evaluates the performance of CB-SEM and PLS-SEM under normality and non-normality conditions via a simulation. Monte Carlo Simulation in R programming language was employed to generate data based on the theoretical model with one endogenous and four exogenous variables. Each latent variable has three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample size while PLS-SEM could produce the path estimates. Meanwhile, for a larger sample size, CB-SEM estimates have lower variability compared to PLS-SEM. Under non-normality, CB-SEM path estimates were inaccurate for small sample size. However, CB-SEM estimates are more accurate than those of PLS-SEM for sample size of 50 and above. The PLS-SEM estimates are not accurate unless sample size is very large.

Keywords: CB-SEM, Monte Carlo simulation, normality conditions, non-normality, PLS-SEM

Procedia PDF Downloads 402