Search results for: principal objects
1657 The Musical Imagination: Re-Imagining a Sound Education through Musical Boundary Play
Authors: Michael J. Cutler
Abstract:
This paper presents what musical boundary play can look like when beginning music learners work with professional musicians with an emphasis on composition. Music education can be re-imagined through the lenses of boundary objects and boundary play by engaging non-professional musicians in collaborative sound creation, improvisation and composition along with professional musicians. To the author’s best knowledge, no similar study exists on boundary objects and boundary play in music education. The literature reviewed for this paper explores the epistemological perspectives connected to music education and situates musical boundary play as an alternative approach to the more prevalent paradigms of music education in K-12 settings. A qualitative multiple-case study design was chosen to seek an in-depth understanding of the role of boundary objects and musical boundary play. The constant comparative method was utilized in analyzing and interpreting the data, resulting in the development of effective, transferable theory. The study gathered relevant data using audio and video recordings of musical boundary play, artifacts, interviews, and observations. Findings from this study offer insight into the development of a more inclusive music education and yield a pedagogical framework for music education based on musical boundary play. Through the facilitation of musical boundary play, it is possible for music learners to experience musical sound creation, improvisation and composition in the same way an instrumentalist or vocalist would, without the acquisition of the complex component operations required to play a traditional instrument or sing in a proficient manner.
Keywords: boundary play, boundary objects, music education, music pedagogy, musical boundary play
Procedia PDF Downloads 126
1656 Timing Equation for Capturing Satellite Thermal Images
Authors: Toufic Abd El-Latif Sadek
Abstract:
The asphalt object represents asphalted areas, such as roads. The best raw thermal-image data are obtained at specific times of the day and year, avoiding the intervals in which different objects give nearly the same brightness. Seven sample objects were used: asphalt, concrete, metal, rock, dry soil, vegetation, and water. This study derives a general timing equation for capturing satellite thermal images at different locations, based on the fixed times of sunrise and sunset: capture time Tcap = (TM × TSR) ± TS.
Keywords: asphalt, satellite, thermal images, timing equation
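The timing relation above can be sketched in code. The abstract does not define TM, TSR, or TS, so their interpretation here (TSR as a reference sunrise/sunset time in hours, TM as a dimensionless multiplier, TS as a tolerance offset in hours) is purely an assumption for illustration:

```python
def capture_time(tm: float, tsr: float, ts: float) -> tuple:
    """Evaluate Tcap = (TM * TSR) ± TS, returning both branches.

    The meanings and units of TM, TSR and TS are not defined in the
    abstract; the hour-based interpretation here is an assumption.
    """
    base = tm * tsr
    return (base - ts, base + ts)

# Illustrative values only: multiplier 1.5, sunrise at 6.0 h, tolerance 0.5 h
early, late = capture_time(1.5, 6.0, 0.5)
print(early, late)  # window of candidate capture times
```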
Procedia PDF Downloads 350
1655 A Less Complexity Deep Learning Method for Drones Detection
Authors: Mohamad Kassab, Amal El Fallah Seghrouchni, Frederic Barbaresco, Raed Abu Zitar
Abstract:
Detecting objects such as drones is a challenging task, as their relative size and maneuvering capabilities deceive machine learning models and cause them to misclassify drones as birds or other objects. In this work, we investigate applying several deep learning techniques to benchmark real data sets of flying drones. A deep learning paradigm is proposed for the purpose of mitigating the complexity of those systems. The proposed paradigm consists of a hybrid between the AdderNet deep learning paradigm and the Single Shot Detector (SSD) paradigm. The goal was to minimize the number of multiplication operations in the filtering layers within the proposed system and, hence, reduce complexity. A standard machine learning technique, SVM, is also tested and compared with the deep learning systems. The data sets used for training and testing were either complete or filtered in order to remove the images with small objects. The data were either RGB or IR. Comparisons were made between all these types, and conclusions were presented.
Keywords: drones detection, deep learning, birds versus drones, precision of detection, AdderNet
Procedia PDF Downloads 182
1654 Genetic Variability and Principal Component Analysis in Eggplant (Solanum melongena)
Authors: M. R. Naroui Rad, A. Ghalandarzehi, J. A. Koohpayegani
Abstract:
Nine advanced cultivars and lines were planted in transplant trays in March 2013. In mid-April 2014, the nine cultivars and lines were taken from the seedling trays and were evaluated and compared in an experiment laid out as a completely randomized block design with three replications at the Agricultural Research Station, Zahak. The results of the analysis of variance showed that there was a significant difference between the studied cultivars in terms of average fruit weight, fruit length, fruit diameter, ratio of fruit length to its diameter, the relative number of seeds per fruit, and yield per plant. The cultivar Sohrab and the line Y6, with averages of 41.9 and 36.7 t/ha respectively, gave the highest total yields. The results of simple correlation between the analyzed traits showed that the final yield was affected by the average fruit weight, through the direct and indirect effects of fruit weight and plant yield on the final yield. Genotypic variance and heritability values were high for fruit weight, fruit length and number of seeds per fruit. The first two principal components accounted for 81.6% of the total variation among the characters describing the genotypes.
Keywords: eggplant, principal component, variation, path analysis
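The share of total variation captured by the leading principal components, as reported in this abstract, can be computed with a short eigen-decomposition sketch. The trait matrix below is random stand-in data, not the study's measurements:

```python
import numpy as np

def explained_variance_ratio(X: np.ndarray) -> np.ndarray:
    """Fraction of total variance carried by each principal component.

    X: rows = genotypes, columns = trait measurements.
    """
    Xc = X - X.mean(axis=0)                   # center each trait
    cov = np.cov(Xc, rowvar=False)            # trait covariance matrix
    eigvals = np.linalg.eigvalsh(cov)[::-1]   # eigenvalues, descending
    return eigvals / eigvals.sum()

# Toy data standing in for 9 genotypes x 6 traits (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(9, 6))
ratios = explained_variance_ratio(X)
print(ratios[:2].sum())  # share of variation explained by the first two PCs
```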
Procedia PDF Downloads 231
1653 Principal Component Analysis of Body Weight and Morphometric Traits of New Zealand Rabbits Raised under Semi-Arid Condition in Nigeria
Authors: Emmanuel Abayomi Rotimi
Abstract:
Context: Rabbit production plays an important role in increasing the animal protein supply in Nigeria, providing a cheap, affordable, and healthy source of meat. The growth of animals involves an increase in body weight, which can change the conformation of various parts of the body. Live weight and linear body measurements are indicators of growth rate in rabbits and other farm animals. Aims: This study aimed to define the body dimensions of New Zealand rabbits and to investigate the morphometric trait variables that contribute to body conformation by the use of principal component analysis (PCA). Methods: Data were obtained from 80 New Zealand rabbits (40 bucks and 40 does) raised at the Livestock Teaching and Research Farm, Federal University Dutsinma. Data were taken on body weight (BWT), body length (BL), ear length (EL), tail length (TL), heart girth (HG) and abdominal circumference (AC). The data collected were subjected to multivariate analysis using the SPSS 20.0 statistical package. Key results: The descriptive statistics showed that the mean BWT, BL, EL, TL, HG, and AC were 0.91 kg, 27.34 cm, 10.24 cm, 8.35 cm, 19.55 cm and 21.30 cm respectively. Sex showed a significant (P<0.05) effect on all the variables examined, with higher values recorded for does. The phenotypic correlation coefficients (r) between the morphometric traits were all positive and ranged from r = 0.406 (between EL and BL) to r = 0.909 (between AC and HG). HG was the trait most correlated with BWT (r = 0.786). Principal component analysis with variance-maximizing orthogonal rotation was used to extract the components. Two principal components (PCs) from the factor analysis of morphometric traits explained about 80.42% of the total variance: PC1 accounted for 64.46% and PC2 for 15.97%. Three variables, representing body conformation, loaded highest on PC1, which is therefore regarded as the body conformation component. Conclusions: This component could be used as a selection criterion for improving the body weight of rabbits.
Keywords: conformation, multicollinearity, multivariate, rabbits, principal component analysis
Procedia PDF Downloads 130
1652 Structure-Constructivism in the Philosophy of Mathematics
Authors: Jeansou Moun
Abstract:
This study argues that constructivism and structuralism, the two important schools of mathematical philosophy since the mid-19th century, can and should be synthesized into structure-constructivism. In fact, the philosophy of mathematics is divided into more than ten schools depending on the point of view. The biggest trend, however, is Platonism, which claims that mathematical objects are "abstract entities" that exist independently of the human mind and material objects. Its opposite is constructivism. According to the latter, mathematical objects are products of the construction of the human mind. Depending on whether the basis of the construction is a logical device, a symbolic system, or an empirical perception, constructivism is subdivided into logicism, formalism, and intuitionism. These three schools are themselves further subdivided into various variants, and among them, structuralism, which emerged in the mid-20th century, is receiving the most attention. Structuralism, which emphasizes structure instead of individual objects, is divided into non-eliminative structuralism, which supports the a priori status of structure, and eliminative structuralism, which rejects any abstract entity. In this context, it is believed that the structure itself is not an a priori entity but a result of the construction of the cognitive subject, and that no object has ever been given to us in its full meaning from the outset. In other words, concepts are progressively structured through a dialectical cycle between sensory perception, imagination (abstraction), concepts, judgments, and reasoning. Symbols are needed for formal operation, but without concrete manipulation the formal operation cannot have any meaning. However, when formal structurization is achieved, the reality (object) itself is also newly structured. This is "structure-constructivism".
Keywords: philosophy of mathematics, platonism, logicism, formalism, constructivism, structuralism, structure-constructivism
Procedia PDF Downloads 95
1651 Compass Bar: A Visualization Technique for Out-of-View-Objects in Head-Mounted Displays
Authors: Alessandro Evangelista, Vito M. Manghisi, Michele Gattullo, Enricoandrea Laviola
Abstract:
In this work, we propose a custom visualization technique for out-of-view objects in Virtual and Augmented Reality applications using Head-Mounted Displays. In the last two decades, Augmented Reality (AR) and Virtual Reality (VR) technologies have experienced remarkable growth in applications for navigation, interaction, and collaboration in different types of environments, real or virtual. Both environments can be potentially very complex, as they can include many virtual objects located in different places. Given the natural limitation of the human Field of View (about 210° horizontal and 150° vertical), humans cannot perceive objects outside this angular range. Moreover, despite recent technological advances in AR and VR Head-Mounted Displays (HMDs), these devices still suffer from a limited Field of View, especially Optical See-Through displays, thus greatly amplifying the challenge of visualizing out-of-view objects. This problem is not negligible when the user needs to be aware of the number and the position of the out-of-view objects in the environment, for instance, during a maintenance operation on a construction site where virtual objects serve to improve hazard awareness. Providing such information can enhance the comprehension of the scene, enable fast navigation and focused search, and improve users’ safety. In our research, we investigated how to represent out-of-view objects in HMD User Interfaces (UI). Inspired by commercial video games such as Call of Duty Modern Warfare, we designed a customized compass. By exploiting the Unity 3D graphics engine, we implemented our custom solution, which can be used both in AR and VR environments. The Compass Bar consists of a graduated bar (in degrees) at the top center of the UI. The values of the bar range from -180 (far left) to +180 (far right), with zero placed in front of the user. Two vertical lines on the bar show the amplitude of the user's field of view.
Every virtual object within the scene is represented on the compass bar as a specific color-coded proxy icon (a circular ring with a colored dot at its center). To provide the user with information about distance, we implemented a specific algorithm that increases the size of the inner dot as the user approaches the virtual object (i.e., when the user reaches the object, the dot fills the ring). This visualization technique for out-of-view objects has several advantages. It allows users to be quickly aware of the number and the position of the virtual objects in the environment. For instance, if the compass bar displays the proxy icon at about +90, users immediately know that the virtual object is to their right, and so on. Furthermore, by having qualitative information about the distance, users can optimize their speed, thus gaining effectiveness in their work. Given the small size and position of the Compass Bar, our solution also helps lessen the occlusion problem, thus increasing user acceptance and engagement. As soon as lockdown measures allow, we will carry out user tests comparing this solution with other state-of-the-art solutions such as 3D Radar, SidebARs and EyeSee360.
Keywords: augmented reality, situation awareness, virtual reality, visualization design
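The two ingredients of the Compass Bar described above (a signed bearing mapped onto the -180..+180 scale and a dot that grows with proximity) can be sketched as follows. This is a minimal geometric sketch, not the authors' Unity implementation; coordinate conventions and the ring/radius parameters are assumptions:

```python
import math

def compass_position(user_pos, user_heading_deg, obj_pos):
    """Signed bearing (degrees, -180..+180) of an object relative to the
    user's forward direction; 0 = straight ahead, +90 = to the right.

    Positions are (x, z) ground-plane coordinates; heading 0 faces +z.
    """
    dx = obj_pos[0] - user_pos[0]
    dz = obj_pos[1] - user_pos[1]
    bearing = math.degrees(math.atan2(dx, dz))          # world-space bearing
    # Wrap the relative angle into the bar's -180..+180 range
    return (bearing - user_heading_deg + 180.0) % 360.0 - 180.0

def dot_radius(distance, max_distance=10.0, ring_radius=8.0):
    """Inner dot grows as the user approaches; fills the ring at distance 0.

    max_distance and ring_radius are illustrative UI parameters.
    """
    closeness = max(0.0, 1.0 - min(distance, max_distance) / max_distance)
    return ring_radius * closeness

# An object 5 m to the user's right shows up at +90 on the bar
print(compass_position((0, 0), 0.0, (5, 0)))
```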
Procedia PDF Downloads 127
1650 [Keynote Talk]: Animation of Objects on the Website by Application of CSS3 Language
Authors: Vladimir Simovic, Matija Varga, Robert Svetlacic
Abstract:
This scientific work analytically explores and demonstrates techniques that can animate objects and geometric characters using the CSS3 language by applying proper formatting and positioning of elements. This paper presents examples of optimal application of the CSS3 descriptive language when generating general web animations (e.g., billiards and the movement of geometric characters). The paper presents, analytically, the optimal development and animation design with the frames within which the animated objects are placed. The originally developed content is based on upgrading existing CSS3 descriptive-language animations with more complex syntax and project-oriented work. The purpose of the developed animations is to provide an overview of the interactive features of CSS3 descriptive-language design for computer games and the animation of important analytical data based on the web view. It has been analytically demonstrated that CSS3 as a descriptive language allows the insertion of various multimedia elements into websites for public and internal sites.
Keywords: web animation recording, KML GML HTML5 forms, Cascading Style Sheets 3, Google Earth Professional
Procedia PDF Downloads 335
1649 The Contribution of Lower Visual Channels and Evolutionary Origin of the Tunnel Effect
Authors: Shai Gabay
Abstract:
The tunnel effect describes the phenomenon where a moving object seems to persist even when temporarily hidden from view. Numerous studies indicate that humans, infants, and nonhuman primates possess object persistence, relying on spatiotemporal cues to track objects that are dynamically occluded. While this ability is associated with neural activity in the cerebral neocortex of humans and mammals, the role of subcortical mechanisms remains ambiguous. In our current investigation, we explore the functional contribution of monocular aspects of the visual system, predominantly subcortical, to the representation of occluded objects. This is achieved by manipulating whether the reappearance of an object occurs in the same or different eye from its disappearance. Additionally, we employ Archerfish, renowned for their precision in dislodging insect prey with water jets, as a phylogenetic model to probe the evolutionary origins of the tunnel effect. Our findings reveal the active involvement of subcortical structures in the mental representation of occluded objects, a process evident even in species that do not possess cortical tissue.
Keywords: archerfish, tunnel effect, mental representations, monocular channels, subcortical structures
Procedia PDF Downloads 45
1648 Application of FT-NIR Spectroscopy and Electronic Nose in On-line Monitoring of Dough Proofing
Authors: Madhuresh Dwivedi, Navneet Singh Deora, Aastha Deswal, H. N. Mishra
Abstract:
FT-NIR spectroscopy and an electronic nose were used to study the kinetics of dough proofing. Spectroscopy was conducted with an optic probe in the diffuse reflectance mode. The dough leavening was carried out at different temperatures (25 and 35°C) and constant RH (80%). Spectra were collected in the range of wave numbers from 12,000 to 4,000 cm-1 directly on the samples, every 5 min during proofing, for up to 2 hours. NIR spectra were corrected for the scatter effect, and second-order derivatization was done to transform the spectra. Principal component analysis (PCA) was applied to the leavening process, and the process kinetics were calculated. PCA was performed on the data set and loadings were calculated. For leavening, four absorption zones (8,950-8,850, 7,200-6,800, 5,250-5,150 and 4,700-4,250 cm-1) were involved in describing the process. Simultaneously, an electronic nose was also used to understand the development of odour compounds during fermentation. The electronic nose was able to differentiate the samples on the basis of aroma generation at different times during fermentation. In order to rapidly differentiate samples based on odor, a principal component analysis was performed and successfully demonstrated in this study. The results suggest that the electronic nose and FT-NIR spectroscopy can be utilized for on-line quality control of the fermentation process during leavening of bread dough.
Keywords: FT-NIR, dough, e-nose, proofing, principal component analysis
Procedia PDF Downloads 390
1647 Towards Update a Road Map Solution: Use of Information Obtained by the Extraction of Road Network and Its Nodes from a Satellite Image
Authors: Z. Nougrara, J. Meunier
Abstract:
In this paper, we present a new approach for extracting roads, the road network and its nodes from satellite images representing regions in Algeria. Our approach is related to our previous research work and is founded on information theory and mathematical morphology. We therefore define objects as sets of pixels and study the shape of these objects and the relations that exist between them. The main interest of this study is to solve the problem of automatic mapping from satellite images, so that the geographical representation of the images is as near as possible to reality.
Keywords: nodes, road network, satellite image, updating a road map
Procedia PDF Downloads 425
1646 Collision Detection Algorithm Based on Data Parallelism
Authors: Zhen Peng, Baifeng Wu
Abstract:
Modern computing technology enters the era of parallel computing with the trend of sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to go along with this trend: it gathers more and more computing ability by increasing the number of processor cores without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications are facing the challenge of an increasingly large amount of data. Data-parallel computing will be an important way to further improve the performance of these applications. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects; collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unmatched with respect to traditional algorithms.
Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability
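The decomposition step described above (complex objects reduced to sets of simple primitives, then tested pairwise in one data-parallel pass) can be sketched with vectorized axis-aligned bounding-box tests. This is a generic illustration of the data-parallel pattern, not the paper's algorithm; NumPy's array operations stand in for hardware SIMD lanes:

```python
import numpy as np

def aabb_overlaps(boxes_a, boxes_b):
    """Vectorized pairwise overlap test between two sets of axis-aligned
    bounding boxes. Each box is (min_x, min_y, min_z, max_x, max_y, max_z).

    Complex objects are assumed pre-decomposed into such simple boxes, so
    a single data-parallel pass tests every (a, b) pair at once.
    """
    a = np.asarray(boxes_a, dtype=float)[:, None, :]   # shape (n, 1, 6)
    b = np.asarray(boxes_b, dtype=float)[None, :, :]   # shape (1, m, 6)
    # Two boxes overlap iff their intervals overlap on every axis
    lo = np.maximum(a[..., :3], b[..., :3])
    hi = np.minimum(a[..., 3:], b[..., 3:])
    return np.all(lo <= hi, axis=-1)                   # (n, m) boolean matrix

# One unit cube vs. two candidate boxes: the first intersects, the second does not
hits = aabb_overlaps([(0, 0, 0, 1, 1, 1)],
                     [(0.5, 0.5, 0.5, 2, 2, 2), (5, 5, 5, 6, 6, 6)])
print(hits)
```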
Procedia PDF Downloads 289
1645 Detection of Image Blur and Its Restoration for Image Enhancement
Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad
Abstract:
Image restoration in the process of communication is one of the emerging fields in image processing. Motion analysis processing is the simplest case of detecting motion in an image. Applications of motion analysis are widely spread across many areas, such as surveillance, remote sensing, the film industry, navigation of autonomous vehicles, etc. A scene may contain multiple moving objects; by using motion analysis techniques, the blur caused by the movement of the objects can be corrected by filling in occluded regions and reconstructing transparent objects, and the motion blurring can also be removed. This paper presents the design and comparison of various motion detection and enhancement filters. The median filter, linear image deconvolution, inverse filter, pseudoinverse filter, Wiener filter, Lucy-Richardson filter and blind deconvolution filters are used to remove the blur. In this work, we have considered different types and different amounts of blur for the analysis. Mean Squared Error (MSE) and Peak Signal-to-Noise Ratio (PSNR) are used to evaluate the performance of the filters. The designed system has been implemented in Matlab software and tested on synthetic and real-time images.
Keywords: image enhancement, motion analysis, motion detection, motion estimation
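The two evaluation metrics named above, MSE and PSNR, are standard and can be sketched directly (the paper's own system is in Matlab; this is an equivalent NumPy sketch, with an assumed 8-bit peak value of 255):

```python
import numpy as np

def mse(original, restored):
    """Mean squared error between the reference and restored images."""
    diff = np.asarray(original, dtype=float) - np.asarray(restored, dtype=float)
    return np.mean(diff ** 2)

def psnr(original, restored, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means a better restoration."""
    err = mse(original, restored)
    if err == 0:
        return float("inf")            # identical images
    return 10.0 * np.log10(peak ** 2 / err)
```

A perfectly restored image gives infinite PSNR, while an all-black vs. all-white pair gives 0 dB, which brackets the scores a deblurring filter can achieve.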
Procedia PDF Downloads 287
1644 Statistical Model of Water Quality in Estero El Macho, Machala-El Oro
Authors: Rafael Zhindon Almeida
Abstract:
Surface water quality is an important concern for the evaluation and prediction of water quality conditions. The objective of this study is to develop a statistical model that can accurately predict the water quality of the El Macho estuary in the city of Machala, El Oro province. The methodology is of a basic type, involving a thorough review of theoretical foundations to improve the understanding of statistical modeling for water quality analysis. The research design is correlational, using a multivariate statistical model involving multiple linear regression and principal component analysis. The results indicate that water quality parameters such as fecal coliforms, biochemical oxygen demand, chemical oxygen demand, iron and dissolved oxygen exceed the allowable limits. The water of the El Macho estuary is determined to be below the required water quality criteria. The multiple linear regression model, based on chemical oxygen demand and total dissolved solids, explains 99.9% of the variance of the dependent variable. In addition, principal component analysis shows that the model has an explanatory power of 86.242%. The study successfully developed a statistical model to evaluate the water quality of the El Macho estuary. The estuary did not meet the water quality criteria, with several parameters exceeding the allowable limits. The multiple linear regression model and principal component analysis provide valuable information on the relationships between the various water quality parameters. The findings of the study emphasize the need for immediate action to improve the water quality of the El Macho estuary to ensure the preservation and protection of this valuable natural resource.
Keywords: statistical modeling, water quality, multiple linear regression, principal components, statistical models
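A multiple linear regression of the kind described (a response regressed on predictors such as chemical oxygen demand and total dissolved solids, with the explained variance reported as R²) can be sketched with ordinary least squares. The predictor names and data below are illustrative, not the study's measurements:

```python
import numpy as np

def fit_linear_model(X, y):
    """Ordinary least squares fit of y on a predictor matrix X.

    Returns (coefficients including intercept, R^2). The predictors stand
    in for variables such as COD and total dissolved solids.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    A = np.column_stack([np.ones(len(X)), X])       # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    residuals = y - A @ beta
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return beta, 1.0 - ss_res / ss_tot

# Synthetic data with an exact linear relationship: y = 1 + 2*x1 + 3*x2
X = [[1, 2], [2, 3], [3, 5], [4, 4]]
y = [1 + 2 * a + 3 * b for a, b in X]
beta, r2 = fit_linear_model(X, y)
print(beta, r2)
```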
Procedia PDF Downloads 98
1643 Application of Low-order Modeling Techniques and Neural-Network Based Models for System Identification
Authors: Venkatesh Pulletikurthi, Karthik B. Ariyur, Luciano Castillo
Abstract:
System identification from turbulent wakes provides a tactical advantage in preparing for and predicting the trajectory of an opponent's movements. A low-order modeling technique, proper orthogonal decomposition (POD), is used to predict the object from its wake pattern and is compared with a pre-trained image recognition neural network (NN) that classifies the wake patterns into objects. It is demonstrated that the low-order POD model predicts the objects better than the pre-trained NN by ~30%.
Keywords: bluff body wakes, low-order modeling, neural network, system identification
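The POD step named above is commonly computed as an SVD of a snapshot matrix. This is a generic textbook sketch, not the authors' implementation; the rank-1 demo data are illustrative:

```python
import numpy as np

def pod_modes(snapshots, n_modes):
    """Proper orthogonal decomposition of a snapshot matrix
    (columns = flow-field snapshots).

    Returns the leading spatial modes and the fraction of fluctuation
    energy each captures.
    """
    S = np.asarray(snapshots, dtype=float)
    S = S - S.mean(axis=1, keepdims=True)           # subtract the mean field
    U, sigma, _ = np.linalg.svd(S, full_matrices=False)
    energy = sigma ** 2 / np.sum(sigma ** 2)        # energy per mode
    return U[:, :n_modes], energy[:n_modes]

# Rank-1 synthetic wake data: one spatial pattern scaled over four snapshots
v = np.array([1.0, 2.0, 3.0])
S = np.outer(v, np.array([1.0, 2.0, 3.0, 4.0]))
modes, energy = pod_modes(S, 1)
print(energy)  # a single mode should carry essentially all the energy
```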
Procedia PDF Downloads 180
1642 A Principal-Agent Model for Sharing Mechanism in Integrated Project Delivery Context
Abstract:
Integrated project delivery (IPD) is a project delivery method distinguished by a shared risk/reward mechanism and a multiparty agreement. IPD has drawn increasing attention from the construction industry because of its efficiency in solving adversarial problems and its reliability in delivering high-performing buildings. However, some evidence showed that some project participants obtained less profit from IPD projects than from typical projects. They attributed this to the unfair IPD sharing mechanism, which resulted in additional time and cost of negotiation on the sharing fractions among project participants. This study aims to investigate the reward distribution by constructing a principal-agent model. Based on cooperative game theory, it examines how to distribute the shared project rewards between client and non-client parties, and how to identify the sharing fractions among the non-client parties. It is found that at least half of the project savings should be allocated to the non-client parties to motivate them to create more project value. Second, the client should raise his sharing fraction when the integration among project participants is efficient. In addition, the client should allocate higher sharing fractions to the more capable non-client parties. This study can help IPD project participants design fair and motivating sharing mechanisms.
Keywords: cooperative game theory, IPD, principal agent model, sharing mechanism
Procedia PDF Downloads 292
1641 Teachers' Perceptions of Their Principals' Interpersonal Emotionally Intelligent Behaviours Affecting Their Job Satisfaction
Authors: Prakash Singh
Abstract:
For schools to be desirable places in which to work, it is necessary for principals to recognise their teachers’ emotions and be sensitive to their needs. This necessitates that principals are capable of correctly identifying the emotionally intelligent behaviours (EIBs) they need to use in order to be successful leaders. They also need to have knowledge of their emotional intelligence and be able to identify the factors and situations that evoke emotion at an interpersonal level. If a principal is able to do this, then the control and understanding of the emotions and behaviours of oneself and others could improve vastly. This study focuses on the interpersonal EIBs of principals affecting the job satisfaction of teachers. The correlation coefficients in this quantitative study strongly indicate that there is statistical significance between the respondents’ level of job satisfaction, the rating of their principals’ EIBs, and how they believe their principals’ EIBs affect their sense of job satisfaction. It can be concluded from the data obtained in this study that there is a significant correlation between teachers’ sense of job satisfaction and their principals’ interpersonal EIBs. This means that the more satisfied a teacher is at school, the more appropriate and meaningful a principal’s EIBs will be. Conversely, the more dissatisfied a teacher is at school, the less appropriate and less meaningful a principal’s interpersonal EIBs will be. This implies that a leader’s EIBs can be construed as one of the major factors affecting the job satisfaction of employees.
Keywords: emotional intelligence, teachers' emotions, teachers' job satisfaction, principals' emotionally intelligent behaviours
Procedia PDF Downloads 472
1640 Identifying Missing Component in the Bechdel Test Using Principal Component Analysis Method
Authors: Raghav Lakhotia, Chandra Kanth Nagesh, Krishna Madgula
Abstract:
A lot has been said and discussed regarding the rationale and significance of the Bechdel score. It became a digital sensation in 2013, when Swedish cinemas began to showcase the Bechdel test score of a film alongside its rating. The test has drawn criticism from experts and the film fraternity regarding its use to rate the female presence in a movie. The pundits believe that the score is too simplified and that the underlying criteria for a film to pass the test (it features at least two women, who have at least one dialogue, about something other than a man) are egregiously coarse. In this research, we have considered a few more parameters which highlight how females are represented in film, like the number of female dialogues in a movie, the dialogue genre, and the part-of-speech tags in the dialogue. These parameters are missing from the existing criteria used to calculate the Bechdel score. The research analyzes 342 movie scripts to test the hypothesis that these extra parameters, together with the current Bechdel criteria, are significant in calculating the female representation score. The result of the principal component analysis method concludes that female dialogue content is a key component and should be considered while measuring the representation of women in a work of fiction.
Keywords: Bechdel test, dialogue genre, parts of speech tags, principal component analysis
Procedia PDF Downloads 142
1639 Semantic Processing in Chinese: Category Effects, Task Effects and Age Effects
Authors: Yi-Hsiu Lai
Abstract:
The present study aimed to elucidate the nature of semantic processing in Chinese. Language and cognition related to the issue of aging are examined from the perspective of picture naming and category fluency tasks. Twenty Chinese-speaking adults (ranging from 25 to 45 years old) and twenty Chinese-speaking seniors (ranging from 65 to 75 years old) in Taiwan participated in this study. Each of them individually completed two tasks: a picture naming task and a category fluency task. The instruments for the naming task were sixty black-and-white pictures: thirty-five object and twenty-five action pictures. The category fluency task also consisted of two semantic categories, objects (or nouns) and actions (or verbs). Participants were asked to report as many items within a category as possible in one minute. Scores for action fluency and object fluency were a summation of the correct responses in these two categories. Category effects (actions vs. objects) and age effects were examined in these tasks. Objects were further divided into two major types: living and non-living objects. Actions were also categorized into two major types: action verbs and process verbs. Reaction time to each picture/question was additionally calculated and analyzed. Results of the category fluency task indicated that the content of information in Chinese seniors was comparatively deteriorated, thus producing a smaller number of semantic-lexical items. A significant group difference was also found in the results of reaction time. The category effect was significant for both Chinese adults and seniors in the semantic fluency task. The findings in the present study help characterize the nature of semantic processing in Chinese-speaking adults and seniors and contribute to the issue of language and aging.
Keywords: semantic processing, aging, Chinese, category effects
Procedia PDF Downloads 361
1638 Design of an Acoustic Imaging Sensor Array for Mobile Robots
Authors: Dibyendu Roy, V. Ramu Reddy, Parijat Deshpande, Ranjan Dasgupta
Abstract:
Imaging of underwater objects is primarily conducted by acoustic imagery due to the severe attenuation of electromagnetic waves in water. Acoustic imagery underwater has a varied range of significant applications, such as side-scan sonar and mine-hunting sonar. It also finds utility in other domains, such as imaging of body tissues via ultrasonography and non-destructive testing of objects. In this paper, we explore the feasibility of using active acoustic imagery in air and simulate phased array beamforming techniques available in the literature for various array designs to achieve a suitable acoustic sensor array design for a portable mobile robot, which can be applied to detect the presence or absence of anomalous objects in a room. The multi-path reflection effects, especially in enclosed rooms, and environmental noise factors are currently not simulated and will be dealt with during the experimental phase. The related hardware is designed with the same feasibility criterion, that the developed system needs to be deployed on a portable mobile robot. There is a trade-off between image resolution and range on one hand and the array size, number of elements and imaging frequency on the other, which has to be iteratively simulated to achieve the desired acoustic sensor array design. The designed acoustic imaging array system is to be mounted on a portable mobile robot and targeted for use in surveillance missions for intruder alerts and for imaging objects in dark and smoky scenarios where conventional optics-based systems do not function well.
Keywords: acoustic sensor array, acoustic imagery, anomaly detection, phased array beamforming
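The trade-off discussed above between array size, element count and imaging frequency can be explored with a minimal far-field delay-and-sum sketch for a linear array in air. The geometry, frequency and speed-of-sound values below are illustrative assumptions, not the paper's design:

```python
import numpy as np

def delay_and_sum(element_positions, angles_deg, frequency, c=343.0):
    """Far-field delay-and-sum beam pattern of a linear array in air.

    element_positions: element x-coordinates in metres.
    Returns the normalized array response at each steering angle
    (1.0 at a perfectly coherent direction).
    """
    k = 2 * np.pi * frequency / c                     # acoustic wavenumber
    angles = np.radians(np.asarray(angles_deg, dtype=float))
    # Per-element phase of a plane wave arriving from each angle
    phases = np.outer(np.sin(angles), element_positions) * k
    response = np.abs(np.exp(1j * phases).sum(axis=1))
    return response / len(element_positions)

# Assumed design: 8 elements, 4 mm pitch, 40 kHz ultrasonic tone
pos = np.arange(8) * 0.004
resp = delay_and_sum(pos, [0.0, 30.0], 40000.0)
print(resp)  # broadside response vs. an off-axis direction
```

Sweeping the element count and pitch in this sketch shows directly how a larger array narrows the main lobe, which is the resolution/range trade-off the abstract refers to.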
Procedia PDF Downloads 409
1637 Object Detection Based on Plane Segmentation and Features Matching for a Service Robot
Authors: António J. R. Neves, Rui Garcia, Paulo Dias, Alina Trifan
Abstract:
With the aging of the world population and the continuous growth in technology, service robots are increasingly explored as alternatives to healthcare givers or personal assistants for elderly or disabled people. Any service robot should be capable of interacting with its human companion, receiving commands, navigating through the environment, whether known or unknown, and recognizing objects. This paper proposes an approach to object recognition based on the use of depth information and color images for a service robot. We present a study of two of the most widely used methods for object detection, in which 3D data is used to detect the position of the objects to be classified, which are found on horizontal surfaces. Since most of the objects of interest accessible to service robots are on these surfaces, the proposed 3D segmentation reduces the processing time and simplifies the scene for object recognition. The first approach to object recognition is based on color histograms, while the second is based on the SIFT and SURF feature descriptors. We present comparative experimental results obtained with a real service robot.
Keywords: object detection, feature descriptors, SIFT, SURF, depth images, service robots
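The first approach (color histograms) is simple enough to sketch in numpy: build a normalised per-channel histogram for each candidate object and compare by histogram intersection. A minimal sketch with synthetic data, not the paper's actual implementation:

```python
import numpy as np

def color_histogram(image, bins=8):
    """Concatenated per-channel histogram, L1-normalised to sum to 1."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
             for c in range(image.shape[-1])]
    h = np.concatenate(hists).astype(float)
    return h / h.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical colour distributions."""
    return float(np.minimum(h1, h2).sum())

# A random "object" patch compared with itself and with a darkened copy
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(32, 32, 3))
same = histogram_intersection(color_histogram(patch), color_histogram(patch))
dark = histogram_intersection(color_histogram(patch),
                              color_histogram(patch // 4))
```

An identical patch scores 1.0, while a colour-shifted one scores much lower; real systems pair this with the depth-based plane segmentation so that histograms are only computed for objects already isolated on a surface.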
Procedia PDF Downloads 546
1636 Developing the Principal Change Leadership Non-Technical Competencies Scale: An Exploratory Factor Analysis
Authors: Tai Mei Kin, Omar Abdull Kareem
Abstract:
In light of globalization, educational reform has become a top priority for many countries. However, the task of leading change effectively requires a multidimensional set of competencies. Over the past two decades, the technical competencies of principal change leadership have been extensively analysed and discussed. Comparatively little research has been conducted in the Malaysian education context on non-technical competencies, popularly known as emotional intelligence, which are equally crucial for the success of change. This article provides a validation of the Principal Change Leadership Non-Technical Competencies (PCLnTC) Scale, a tool that practitioners can easily use to assess school principals' level of the change leadership non-technical competencies that facilitate change and maximize change effectiveness. The overall coherence of the PCLnTC model was constructed by incorporating three theories: a) change leadership theory, whereby leading change is the fundamental role of a leader; b) competency theory, in which leadership can be taught and learned; and c) the concept of emotional intelligence, whereby it can be developed, fostered and taught. An exploratory factor analysis (EFA) was used to determine the underlying factor structure of the PCLnTC model. Before conducting the EFA, five pilot-test steps were carried out to ensure the validity and reliability of the instrument: a) review by academic colleagues; b) verification and comments from a panel; c) evaluation of questionnaire format, syntax, design, and completion time; d) evaluation of item clarity; and e) assessment of internal consistency reliability. A total of 335 teachers from 12 High Performing Secondary Schools in Malaysia completed the survey. The PCLnTC Scale, with six-point Likert-type items, was subjected to Principal Components Analysis. The analysis yielded a three-factor solution, namely a) Interpersonal Sensitivity, b) Flexibility, and c) Motivation, explaining a total of 74.326 per cent of the variance. Based on the results, implications for instrument revisions are discussed and specifications for future confirmatory factor analysis are delineated.
Keywords: exploratory factor analysis, principal change leadership non-technical competencies (PCLnTC), interpersonal sensitivity, flexibility, motivation
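The factor-retention step of such an EFA can be illustrated with the common Kaiser criterion (retain factors whose correlation-matrix eigenvalues exceed 1). The sketch below uses synthetic survey data with a known three-factor structure and the same sample size; it is a generic illustration, not the authors' analysis:

```python
import numpy as np

def kaiser_factors(data):
    """Number of factors retained by the Kaiser criterion (eigenvalues of
    the correlation matrix > 1) and the variance they explain (per cent)."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
    n = int((eigvals > 1.0).sum())
    explained = 100.0 * eigvals[:n].sum() / eigvals.sum()
    return n, explained

# Synthetic survey: 335 respondents, 9 items loading on 3 latent factors
rng = np.random.default_rng(7)
latent = rng.normal(size=(335, 3))
loadings = np.kron(np.eye(3), np.ones((3, 1)))   # 3 items per factor
items = latent @ loadings.T + 0.3 * rng.normal(size=(335, 9))
n_factors, pct = kaiser_factors(items)
```

With a clean latent structure the criterion recovers the three factors, mirroring the three-factor solution reported above.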
Procedia PDF Downloads 425
1635 Optical Variability of Faint Quasars
Authors: Kassa Endalamaw Rewnu
Abstract:
The variability properties of a quasar sample, spectroscopically complete to magnitude J = 22.0, are investigated on a time baseline of 2 years using three different photometric bands (U, J and F). The original sample was obtained using a combination of different selection criteria: colors, slitless spectroscopy and variability, based on a time baseline of 1 yr. The main goals of this work are two-fold: first, to derive the percentage of variable quasars on a relatively short time baseline; secondly, to search for new quasar candidates missed by the other selection criteria; and, thus, to estimate the completeness of the spectroscopic sample. In order to achieve these goals, we have extracted all the candidate variable objects from a sample of about 1800 stellar or quasi-stellar objects with limiting magnitude J = 22.50 over an area of about 0.50 deg². We find that > 65% of all the objects selected as possible variables are either confirmed quasars or quasar candidates on the basis of their colors. This percentage increases even further if we exclude from our lists of variable candidates a number of objects equal to that expected on the basis of 'contamination' induced by our photometric errors. The percentage of variable quasars in the spectroscopic sample is also high, reaching about 50%. On the basis of these results, we can estimate that the incompleteness of the original spectroscopic sample is < 12%. We conclude that variability analysis of data with small photometric errors can be successfully used as an efficient and independent (or at least auxiliary) selection method in quasar surveys, even when the time baseline is relatively short. Finally, when corrected for the different intrinsic time lags corresponding to a fixed observed time baseline, our data do not show a statistically significant correlation between variability and either absolute luminosity or redshift.
Keywords: nuclear activity, galaxies, active quasars, variability
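A minimal sketch of how a variability-based candidate selection might work: flag objects whose magnitude change between two epochs exceeds a multiple of the combined photometric error. The threshold rule and all numbers are illustrative assumptions, not the paper's actual selection criterion:

```python
import numpy as np

def variable_candidates(mag1, mag2, sigma, k=3.0):
    """Indices of objects whose magnitude change between two epochs exceeds
    k times the combined photometric error of the two measurements."""
    dm = np.abs(np.asarray(mag2) - np.asarray(mag1))
    threshold = k * np.sqrt(2.0) * np.asarray(sigma)
    return np.flatnonzero(dm > threshold)

# Four objects, two of which vary far beyond the assumed 0.05 mag error
m1 = np.array([21.00, 20.50, 22.00, 19.80])
m2 = np.array([21.02, 21.10, 21.99, 20.25])
cands = variable_candidates(m1, m2, sigma=0.05)
```

Keeping the photometric error small is what makes this cut efficient, which is the paper's point about variability as an auxiliary selection method.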
Procedia PDF Downloads 811634 Complex Technology of Virtual Reconstruction: The Case of Kazan Imperial University of XIX-Early XX Centuries
Authors: L. K. Karimova, K. I. Shariukova, A. A. Kirpichnikova, E. A. Razuvalova
Abstract:
This article deals with the technology of virtual reconstruction of Kazan Imperial University in the XIX - early XX centuries. The paper describes technologies for 3D visualization of high-resolution models of objects in the university space, the creation of a multi-agent system, an organized database of historical sources connected with these objects, and options for using immersive virtual-environment technologies.
Keywords: 3D-reconstruction, multi-agent system, database, university space, virtual reconstruction, virtual heritage
Procedia PDF Downloads 272
1633 Clustering Color Space-Time Interest Points for Moving Objects
Authors: Insaf Bellamine, Hamid Tairi
Abstract:
Detecting moving objects in image sequences is an essential step in video analysis. This paper focuses on the extraction and detection of Color Space-Time Interest Points (CSTIP), and we propose a new method for the detection of moving objects. The proposed method comprises two main steps. First, we apply the CSTIP detection algorithm to both components of a color structure-texture image decomposition based on a partial differential equation (PDE): a color geometric structure component and a color texture component. A descriptor is associated with each of these points. In a second stage, we address the problem of grouping the CSTIP points into clusters. Experiments and comparisons with other motion-detection methods on challenging sequences show the performance of the proposed method and its utility for video analysis. Experimental results are obtained from very different types of videos, namely sport videos and animation movies.
Keywords: Color Space-Time Interest Points (CSTIP), Color Structure-Texture Image Decomposition, motion detection, clustering
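The second stage, grouping interest points into clusters, can be illustrated with a plain k-means loop over 2D point coordinates. This is a generic clustering sketch on synthetic points, not the paper's CSTIP descriptor-based grouping:

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Minimal k-means for grouping interest points into clusters.
    Deterministic seeding: initial centers spread over the index range."""
    points = np.asarray(points, dtype=float)
    centers = points[np.linspace(0, len(points) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # Distance of every point to every center, then nearest-center labels
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

# Two synthetic groups of interest points, e.g. from two moving objects
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal([0, 0], 0.2, size=(30, 2)),
                 rng.normal([5, 5], 0.2, size=(30, 2))])
labels, centers = kmeans(pts, k=2)
```

Each recovered cluster then corresponds to one candidate moving object.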
Procedia PDF Downloads 378
1632 Principal Component Analysis Applied to the Electric Power Systems – Practical Guide; Practical Guide for Algorithms
Authors: John Morales, Eduardo Orduña
Abstract:
Principal Component Analysis (PCA) theory is currently used to develop algorithms for Electric Power Systems (EPS). In this context, this paper presents a practical tutorial on the technique, detailing its concept and the on-line and off-line mathematical foundations that are necessary and desirable in EPS algorithms. Features of the eigenvectors that are very useful for real-time processing are explained, showing how these parameters can be selected through direct optimization. In addition, to show the application of PCA to off-line and on-line signals, a step-by-step example using Matlab commands is presented. Finally, a list of different approaches using PCA is given, together with works that could be analyzed using this tutorial.
Keywords: practical guide, on-line, off-line, algorithms, faults
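The off-line eigenvector computation at the heart of such a tutorial can be sketched in numpy rather than Matlab: diagonalize the covariance of a signal matrix and rank the eigenvectors by eigenvalue. The "fault signature" data below is an invented toy, not an actual EPS measurement:

```python
import numpy as np

def pca(signals):
    """Off-line PCA: eigen-decomposition of the covariance matrix of a
    signal matrix (rows = samples, columns = channels/measurements)."""
    centered = signals - signals.mean(axis=0)
    cov = centered.T @ centered / (len(signals) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending order
    order = np.argsort(eigvals)[::-1]        # re-sort descending
    return eigvals[order], eigvecs[:, order]

# Toy EPS-like data: two channels share one oscillatory "fault signature"
t = np.linspace(0, 1, 200)
base = np.sin(2 * np.pi * 5 * t)
signals = np.column_stack([base, 0.8 * base,
                           0.1 * np.cos(2 * np.pi * 3 * t)])
vals, vecs = pca(signals)
share = vals[0] / vals.sum()   # variance captured by the first eigenvector
```

Because two channels carry the same underlying signal, a single eigenvector captures almost all the variance; this dominance of a few eigenvectors is what makes PCA attractive for real-time EPS processing.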
Procedia PDF Downloads 563
1631 Principal Component Analysis in Drug-Excipient Interactions
Authors: Farzad Khajavi
Abstract:
Studies of the interaction between active pharmaceutical ingredients (APIs) and excipients are important in the pre-formulation stage of the development of all dosage forms. Analytical techniques such as differential scanning calorimetry (DSC), thermogravimetry (TG), and Fourier transform infrared spectroscopy (FTIR) are commonly used tools for investigating the compatibility or incompatibility of APIs with excipients. Sometimes the interpretation of data obtained from these techniques is difficult because of severe overlapping of the API spectrum with those of the excipients in their mixtures. Principal component analysis (PCA), as a powerful factor-analytical method, is used in these situations to resolve the data matrices acquired from these analytical techniques. Binary mixtures of the API and the excipients of interest are prepared. The peaks of the FTIR, DSC, or TG curves of the pure API, the pure excipient, and their mixtures at different mole ratios construct the rows of the data matrix. By applying PCA to the data matrix, the number of principal components (PCs) is determined so that they contain the total variance of the data matrix. The PCs or factors obtained from the scores of the matrix are then plotted in two-dimensional space. If the pure API, its mixtures with a high amount of API, and the 1:1 mixture form one cluster, while the other cluster comprises the pure excipient and its blends with a high amount of excipient, this confirms compatibility between the API and the excipient of interest; otherwise, incompatibility is indicated.
Keywords: API, compatibility, DSC, TG, interactions
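The score-plot idea can be sketched with simulated spectra: when mixtures are purely additive (no interaction), their PC scores vary smoothly with the API mole fraction, so high-API samples cluster near the pure API and away from the pure excipient. Everything below (Gaussian "bands", mole ratios) is an invented illustration, not real DSC/FTIR data:

```python
import numpy as np

def pc_scores(spectra, n=2):
    """Scores of the first n principal components of a set of spectra
    (rows = samples), computed via SVD of the mean-centred matrix."""
    centered = spectra - spectra.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n].T

def gaussian_peak(x, center, width):
    return np.exp(-((x - center) / width) ** 2)

# Simulated spectra: a pure API band, a pure excipient band, and binary
# mixtures at several mole ratios with no interaction (pure additivity)
x = np.linspace(0, 1, 300)
api = gaussian_peak(x, 0.3, 0.05)
excipient = gaussian_peak(x, 0.7, 0.05)
ratios = np.array([1.0, 0.75, 0.5, 0.25, 0.0])   # API mole fraction
spectra = np.array([r * api + (1 - r) * excipient for r in ratios])
scores = pc_scores(spectra)
```

The first PC score is monotone in the mole fraction, so the high-API samples sit together at one end of the plot, exactly the clustering pattern the abstract describes for a compatible pair.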
Procedia PDF Downloads 132
1630 The Impact of Purpose as a Principal Leadership Skill on the Performance of Select Township Schools in South Africa
Authors: Pepe Marais, Krishna Govender
Abstract:
This study aimed to investigate the impact of "purpose" as a principal leadership skill on the performance of two township schools, using a quantitative research design and collecting data from the school principals, teachers and matric learners with the 28-scale Servant Leadership Test as well as Gallup's Q12 Employee Engagement survey. The questionnaires addressed the key objectives, namely, the extent to which the principals of the participating schools exhibited servant leadership and their understanding of "purpose" as one word in leadership, and how teachers and learners perceived the impact of a "one-word" purpose-driven leader on the performance of the selected schools. Although no relationship could be demonstrated between "purpose" and the performance of the two township schools, it became evident that a significant increase in servant leadership leads to a significant increase in engagement and performance, as measured by the matric pass rate. It is recommended that workshops be facilitated with principals and teachers in order to entrench "purpose" more deeply throughout the schools. In addition, servant leadership training should be conducted to increase the leadership ability of the school principals. Future research in the area of "purpose as one word", as well as servant leadership as a principal skill set within South Africa's public school leadership, is recommended.
Keywords: school leadership, servant leadership, one-word purpose, engagement, leadership
Procedia PDF Downloads 125
1629 Recognition of Objects in a Maritime Environment Using a Combination of Pre- and Post-Processing of the Polynomial Fit Method
Authors: R. R. Hordijk, O. J. G. Somsen
Abstract:
Traditionally, radar systems are the eyes and ears of a ship. However, these systems have their drawbacks, and nowadays they are extended with systems that work with video and photos. Processing the data from these videos and photos is, however, very labour-intensive, and efforts are being made to automate this process. A major problem when trying to recognize objects in water is that the 'background' is not homogeneous, so traditional image recognition techniques do not work well. The main question is whether a method can be developed that automates this recognition process. A large number of parameters are involved in facilitating the identification of objects in such images. One is varying the resolution. In this research, the resolution of some images was reduced to the extreme value of 1% of the original to reduce clutter before the polynomial fit (pre-processing). It turned out that the searched object was clearly recognizable, as its grey value was well above the average. Another approach is to take two images of the same scene shortly after each other and compare the results. Because the water (waves) fluctuates much faster than an object floating in it, one can expect the object to be the only stable item in the two images. Both methods (pre-processing and comparing two images of the same scene) delivered useful results. Though it is too early to conclude that these methods can solve all image problems, they are certainly worthwhile for further research.
Keywords: image processing, image recognition, polynomial fit, water
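The two-image comparison can be sketched directly: pixels that stay nearly constant between two frames mark the stable floating object, while the fluctuating water changes everywhere else. The synthetic "sea" and threshold below are illustrative assumptions, not the paper's data:

```python
import numpy as np

def stable_pixel_mask(frame1, frame2, threshold=10):
    """Pixels that barely change between two frames taken shortly after
    each other: waves fluctuate, a floating object stays put."""
    diff = np.abs(frame1.astype(int) - frame2.astype(int))
    return diff < threshold

# Synthetic sea: every pixel fluctuates between frames, except a small
# bright object at a fixed position
rng = np.random.default_rng(3)
sea1 = rng.integers(60, 200, size=(40, 40))
sea2 = rng.integers(60, 200, size=(40, 40))
sea1[18:22, 18:22] = 230
sea2[18:22, 18:22] = 230
mask = stable_pixel_mask(sea1, sea2)
```

The object region survives the mask intact while most water pixels are rejected, which is the stability argument the abstract makes.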
Procedia PDF Downloads 534
1628 The Power of the Proper Orthogonal Decomposition Method
Authors: Charles Lee
Abstract:
The Proper Orthogonal Decomposition (POD) technique has been used as a model reduction tool for many applications in engineering and science. In principle, one begins with an ensemble of data, called snapshots, collected from an experiment or from laboratory results. The beauty of the POD technique is that, when applied, the entire data set can be represented by the smallest number of orthogonal basis elements. It is this capability that allows us to reduce the complexity and dimensionality of many physical applications. Mathematical formulations and numerical schemes for the POD method are discussed, along with applications in NASA's Deep Space Large Antenna Arrays, satellite image reconstruction, cancer detection with DNA microarray data, maximizing stock returns, and medical imaging.
Keywords: reduced-order methods, principal component analysis, cancer detection, image reconstruction, stock portfolios
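The snapshot-based POD computation is commonly implemented via an SVD of the snapshot matrix: the leading left singular vectors are the orthogonal basis elements, and the squared singular values measure how much "energy" each captures. A minimal sketch on a synthetic rank-2 snapshot ensemble, assuming this standard SVD formulation:

```python
import numpy as np

def pod_basis(snapshots, r):
    """Leading r POD modes of a snapshot matrix (columns = snapshots) and
    the fraction of 'energy' (squared singular values) they capture."""
    u, s, vt = np.linalg.svd(snapshots, full_matrices=False)
    energy = float((s[:r] ** 2).sum() / (s ** 2).sum())
    return u[:, :r], energy

# Snapshot ensemble built from two spatial structures -> exactly rank 2
x = np.linspace(0, np.pi, 100)
t = np.linspace(0, 1, 50)
snaps = (np.outer(np.sin(x), np.cos(2 * np.pi * t))
         + 0.3 * np.outer(np.sin(2 * x), np.sin(2 * np.pi * t)))
modes, energy = pod_basis(snaps, r=2)
recon = modes @ (modes.T @ snaps)   # projection onto the 2-mode basis
```

Here 50 snapshots of 100 values each collapse onto just two orthogonal modes with essentially no reconstruction error, which is the dimensional reduction the abstract describes.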
Procedia PDF Downloads 84