Search results for: canonical objects
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 991


511 A Collaborative Platform for Multilingual Ontology Development

Authors: Ahmed Tawfik, Fausto Giunchiglia, Vincenzo Maltese

Abstract:

Ontologies provide a common understanding of a specific domain of interest that can be communicated between people and used as background knowledge for automated reasoning in a wide range of applications. In this paper, we address the design of multilingual ontologies following well-defined knowledge engineering methodologies with the support of novel collaborative development approaches. In particular, we present a collaborative platform which allows ontologies to be developed incrementally in multiple languages. This is made possible via an appropriate mapping between language-independent concepts and one lexicalization per language (or a lexical gap in case such a lexicalization does not exist). The collaborative platform has been designed to support the development of the Universal Knowledge Core, a multilingual ontology currently covering English, Italian, Chinese, Mongolian, Hindi, and Bangla. Its design follows a workflow-based development methodology that models resources as a set of collaborative objects and assigns customizable workflows to build and maintain each collaborative object in a community-driven manner, with extensive support for modern Web 2.0 social and collaborative features.
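The concept-to-lexicalization mapping described above can be pictured as a simple keyed structure. The sketch below is illustrative only: the concept, gloss, and entries are invented, not the UKC schema, and it only assumes one lexicalization per language or an explicit lexical-gap marker.

```python
# Minimal sketch of the concept-to-lexicalization mapping (invented entries,
# not the actual UKC schema): one lexicalization per language, or a lexical gap.
LEXICAL_GAP = None          # marker: no lexicalization exists in this language

concept = {
    "id": "C-001",
    "gloss": "a year taken off between school and university",
    "lexicalizations": {
        "en": "gap year",
        "it": LEXICAL_GAP,  # invented example of a lexical gap
        "hi": LEXICAL_GAP,
    },
}

def lexicalize(concept, lang):
    """Return the lexicalization for `lang`, or report a lexical gap."""
    term = concept["lexicalizations"].get(lang, LEXICAL_GAP)
    return term if term is not None else f"<lexical gap in {lang}>"

print(lexicalize(concept, "en"))    # gap year
print(lexicalize(concept, "it"))    # <lexical gap in it>
```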

Keywords: knowledge diversity, knowledge representation, ontology, development

Procedia PDF Downloads 375
510 Evaluating the ‘Assembled Educator’ of a Specialized Postgraduate Engineering Course Using Activity Theory and Genre Ecologies

Authors: Simon Winberg

Abstract:

The landscape of professional postgraduate education is changing: the focus of these programmes is moving from preparing candidates for a life in academia towards training in the expert knowledge and skills needed to support industry. This is especially pronounced in engineering disciplines, where increasingly complex products draw on a depth of knowledge from multiple fields. This connects strongly with the broader notion of Industry 4.0 – where technology and society are being brought together to achieve more powerful and desirable products, but products whose inner workings are also more complex than before. The changes in what we do, and how we do it, have a profound impact on what industry would like universities to provide. One such change is the increased demand for taught doctoral and Masters programmes. These programmes aim to provide skills and training for professionals, to expand their knowledge of state-of-the-art tools and technologies. This paper investigates one such course, namely a Software Defined Radio (SDR) Master’s degree course. The teaching support for this course had to be drawn from an existing pool of academics, none of whom were specialists in this field. The paper focuses on the kind of educator, a ‘hybrid academic’, assembled from available academic staff and bolstered by research. The conceptual framework for this paper combines Activity Theory and Genre Ecology. Activity Theory is used to reason about learning and interactions during the course, and Genre Ecology is used to model the building and sharing of technical knowledge related to using tools and artifacts. Data were obtained from meetings with students and lecturers, logs, project reports, and course evaluations. The findings show how the course, which was initially academically oriented, metamorphosed into a tool-dominant peer-learning structure, largely supported by the sharing of technical tool-based knowledge. While the academic staff could address gaps in the participants’ fundamental knowledge of radio systems, the participants brought with them extensive specialized knowledge and tool experience, which they shared with the class. This created a complicated dynamic in the class, which centered largely on engagements with technology artifacts, such as simulators, from which knowledge was built. The course was characterized by a richness of ‘epistemic objects’, that is, objects with knowledge-generating qualities. A significant portion of the course curriculum had to be adapted, and the learning methods changed to accommodate the dynamic interactions that occurred during classes. This paper explains the SDR Masters course in terms of conflicts and innovations in its activity system, as well as the continually hybridizing genre ecology, to show how the structuring and resource-dependence of the course transformed from its initial ‘traditional’ academic structure to a more entangled arrangement over time. It is hoped that insights from this paper will benefit other educators involved in the design and teaching of similar specialized professional postgraduate taught programmes.

Keywords: professional postgraduate education, taught masters, engineering education, software defined radio

Procedia PDF Downloads 72
509 Visual and Clinical Outcome in Patients with Corneal Lacerations

Authors: Avantika Verma

Abstract:

In industrialized nations, corneal lacerations are one of the most common reasons for hospitalization. This study was designed to assess visual and clinical outcomes in patients presenting with full-thickness corneal lacerations in an Indian population and to ascertain the impact of various preoperative and operative factors on prognosis after repair of corneal lacerations. Males in their third decade, injured at work by metallic objects, were the most common group. Lens damage, hyphema, vitreous hemorrhage, retinal detachment, and endophthalmitis were seen. All the patients underwent primary repair within the first 24 hours of presentation. At 3 months, 74.3% had a good visual outcome, while about 5.7% of patients had no perception of light. In conclusion, various demographic and preoperative factors, such as age, time of presentation, vision at presentation, length of the corneal wound, involvement of the visual axis, and associated ocular features like hyphaema, lenticular changes, vitreous haemorrhage, and retinal detachment, are significant prognostic indicators of the final visual outcome.

Keywords: corneal laceration, corneal wound repair, injury, visual outcome

Procedia PDF Downloads 334
508 Semantic Platform for Adaptive and Collaborative e-Learning

Authors: Massra M. Sabeima, Myriam Lamolle, Mohamedade Farouk Nanne

Abstract:

Adapting the learning resources of an e-learning system to the characteristics of the learners is an important aspect to consider when designing an adaptive e-learning system. However, this adaptation is not a simple process; it requires the extraction, analysis, and modeling of user information. This implies a good representation of the user's profile, which is the backbone of the adaptation process. Moreover, during the e-learning process, collaboration with similar users (same geographic province or knowledge context) is important. Productive collaboration motivates users to continue rather than abandon the course and increases the assimilation of learning objects. The contribution of this work is the following: we propose an adaptive e-learning semantic platform that recommends learning resources to learners, using an ontology to model the user profile and the course content, together with a multi-agent system able to progressively generate the learning graph for each user during the learning process (taking into account the user's progress and the changes that occur) and to synchronize the users who collaborate on a learning object.
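As a rough illustration of the learning-graph idea only (not the authors' multi-agent implementation; the learning objects and prerequisites below are invented), a prerequisite graph can be queried for the objects a learner is ready for, given what is already completed:

```python
# Minimal sketch: learning objects with prerequisites; recommend what the
# learner is ready to study next, given completed objects (hypothetical data).
prerequisites = {
    "intro":     [],
    "variables": ["intro"],
    "loops":     ["variables"],
    "functions": ["variables"],
    "recursion": ["functions", "loops"],
}

def ready_to_learn(prereqs, completed):
    """Learning objects whose prerequisites are all satisfied and not yet done."""
    return [obj for obj, reqs in prereqs.items()
            if obj not in completed and all(r in completed for r in reqs)]

print(ready_to_learn(prerequisites, completed={"intro", "variables"}))
# ['loops', 'functions']
```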

Keywords: adaptive learning, collaboration, multi-agent, ontology

Procedia PDF Downloads 157
507 The Laser Line Detection for Autonomous Mapping Based on Color Segmentation

Authors: Pavel Chmelar, Martin Dobrovolny

Abstract:

Laser projection, or laser footprint detection, is today widely used in many fields of robotics, measurement, and electronics. The system accuracy strictly depends on precise laser footprint detection on target objects. This article deals with laser line detection based on RGB segmentation and component labeling. The developed optical rangefinder was used as the measurement device. The optical rangefinder is equipped with vertical sweeping of the laser beam and a high-quality camera. This system was developed mainly for automatic exploration and mapping of unknown spaces. The first section presents a new detection algorithm. The second section presents measurement results. The measurements were performed under variable light conditions in interiors. The last part of the article presents the achieved results and the differences between day and night measurements.
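A minimal sketch of the two steps named above, color segmentation followed by connected-component labeling, is given below using OpenCV. The HSV thresholds and the red-laser assumption are illustrative, not the authors' calibration.

```python
import cv2
import numpy as np

def detect_laser_line(frame_bgr):
    """Segment a red laser footprint by color, then keep the largest component."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red hue wraps around 0/180 in OpenCV's HSV; thresholds are illustrative.
    mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 120), (180, 255, 255))
    # Component labeling: keep the largest blob, discard noise and reflections.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    best, best_area = None, 0
    for i in range(1, n):                      # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if area > best_area:
            best, best_area = i, area
    if best is None:
        return None
    return (labels == best).astype(np.uint8) * 255, centroids[best]

# usage: result = detect_laser_line(cv2.imread("scan_frame.png"))
```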

Keywords: color segmentation, component labelling, laser line detection, automatic mapping, distance measurement, vector map

Procedia PDF Downloads 415
506 An Intelligent Baby Care System Based on IoT and Deep Learning Techniques

Authors: Chinlun Lai, Lunjyh Jiang

Abstract:

Due to the heavy burden and pressure of caring for infants, an integrated automatic baby watching system based on IoT smart sensing and deep learning machine vision techniques is proposed in this paper. By monitoring infant body conditions such as heartbeat, breathing, body temperature, and sleeping posture, as well as surrounding conditions such as dangerous/sharp objects, light, noise, humidity, and temperature, the proposed system can analyze and predict obvious or potential dangerous conditions from the observed data and then take suitable actions in real time to protect the infant from harm, thus reducing the caregiver's burden and improving the safety and efficiency of infant care. The experimental results show that the proposed system works successfully for infant care and can therefore be applied practically in various settings.

Keywords: baby care system, Internet of Things, deep learning, machine vision

Procedia PDF Downloads 212
505 Object Tracking in Motion Blurred Images with Adaptive Mean Shift and Wavelet Feature

Authors: Iman Iraei, Mina Sharifi

Abstract:

A method for object tracking in motion-blurred images is proposed in this article. We use the mean shift algorithm as the main tracker for different objects. The problem is that mean shift cannot track the selected object accurately in blurred scenes. So, to improve the tracking result and increase tracking accuracy, the wavelet transform is used. We use a feature named blur extent, which helps us get better tracking results; it is calculated with the Haar wavelet. This feature can be viewed from two angles: it helps determine whether an image is blurred at all and, if so, to what extent. In effect, this feature modifies the covariance matrix of the mean shift algorithm and leads to better tracking performance. The method concentrates mostly on the motion blur parameters. The results demonstrate the ability of our method to track more accurately.
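The blur extent feature itself is not fully specified in this abstract; the sketch below is only a crude Haar-wavelet blur indicator (share of energy in the detail sub-bands), computed with PyWavelets, to illustrate the general idea rather than the authors' exact metric.

```python
import numpy as np
import pywt

def haar_blur_indicator(gray):
    """Crude blur cue: share of energy in the Haar detail sub-bands.
    Sharp images keep more energy in (cH, cV, cD); heavy blur suppresses it."""
    gray = np.asarray(gray, dtype=np.float64)
    cA, (cH, cV, cD) = pywt.dwt2(gray, "haar")
    detail = (cH**2).sum() + (cV**2).sum() + (cD**2).sum()
    return detail / (detail + (cA**2).sum())   # smaller -> likely more blurred

sharp = np.random.rand(64, 64)                 # noisy stand-in for a sharp frame
blurred = np.convolve(sharp.ravel(), np.ones(9) / 9, "same").reshape(64, 64)  # crude blur
print(haar_blur_indicator(sharp), haar_blur_indicator(blurred))
```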

Keywords: mean shift, object tracking, blur extent, wavelet transform, motion blur

Procedia PDF Downloads 197
504 Utilizing the Principal Component Analysis on Multispectral Aerial Imagery for Identification of Underlying Structures

Authors: Marcos Bosques-Perez, Walter Izquierdo, Harold Martin, Liangdon Deng, Josue Rodriguez, Thony Yan, Mercedes Cabrerizo, Armando Barreto, Naphtali Rishe, Malek Adjouadi

Abstract:

Aerial imagery is a powerful tool when it comes to analyzing temporal changes in ecosystems and extracting valuable information from the observed scene. It allows us to identify and assess various elements such as objects, structures, textures, waterways, and shadows. To extract meaningful information, multispectral cameras capture data across different wavelength bands of the electromagnetic spectrum. In this study, the collected multispectral aerial images were subjected to principal component analysis (PCA) to identify independent and uncorrelated components or features that extend beyond the visible spectrum captured in standard RGB images. The results demonstrate that these principal components contain unique characteristics specific to certain wavebands, enabling effective object identification and image segmentation.
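As a sketch of the PCA step (array shapes and band count are assumptions, not the authors' data), a multispectral cube of shape (rows, cols, bands) can be reshaped to pixels × bands and decomposed with scikit-learn:

```python
import numpy as np
from sklearn.decomposition import PCA

def multispectral_pca(cube, n_components=3):
    """cube: (rows, cols, bands) array -> (rows, cols, n_components) PC images."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)               # one row per pixel
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(pixels)             # uncorrelated components
    print("explained variance ratio:", pca.explained_variance_ratio_)
    return scores.reshape(rows, cols, n_components)

# usage with a synthetic 5-band image standing in for a multispectral capture
cube = np.random.rand(128, 128, 5)
pc_images = multispectral_pca(cube)
```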

Keywords: big data, image processing, multispectral, principal component analysis

Procedia PDF Downloads 150
503 Locomotion, Object Exploration, Social Communicative Skills, and Improvement in Language Abilities

Authors: Wanqing He

Abstract:

The current study explores aspects of exploratory behaviors and social capacities in urban Chinese infants to examine whether these factors mediate the link between infant walking and receptive and productive vocabularies. The linkage between the onset of walking and language attainment is well established, but little is known about the factors that drive this link. This study examined whether joint attention, gesture use, and object activities mediate the association between locomotion and language development. Results showed that both the frequency (p = .05) and duration (p = .03) of carrying an object are strong mediators that afford opportunities for word comprehension. Also, accessing distal objects may be beneficial to infants’ language expression. Further studies on why object carrying may account for word comprehension, and why infants with autism do not benefit from walking onset in terms of language development, may yield valuable clinical implications.

Keywords: exploratory behaviors, infancy, language acquisition, motor development, social communicative skills

Procedia PDF Downloads 103
502 Spherical Nonlinear Wave Propagation in Relativistic Quantum Plasma

Authors: Alireza Abdikian

Abstract:

Assuming a quantum relativistic degenerate electron-positron (e-p) plasma medium, the nonlinear acoustic solitary propagation in the presence of stationary ions neutralizing the plasma background of bounded cylindrical geometry was investigated. By using the standard reductive perturbation technique together with the quantum hydrodynamic model for the e-p fluid, the spherical Kadomtsev-Petviashvili equation was derived for small- but finite-amplitude waves, and the solitary wave solution was given for parameters relevant to dense astrophysical objects such as white dwarf stars. By using a suitable coordinate transformation and the improved F-expansion technique, the SKP equation can be solved analytically. The numerical results reveal that the relativistic effects lead to the propagation of electrostatic bell-shaped structures, and that increasing the relativistic effects decreases the amplitude and width of the e-p acoustic solitary wave.

Keywords: electron-positron plasma, acoustic solitary wave, relativistic plasmas, spherical Kadomtsev-Petviashvili equation

Procedia PDF Downloads 127
501 Triadic Relationship of Icon Design for Semi-Literate Communities

Authors: Peng-Hui Maffee Wan, Klarissa Ting Ting Chang, Rax Suen Chun Lung

Abstract:

Icons, or pictorial and graphical objects, are commonly used in Human-Computer Interaction (HCI) as mediators to communicate information to users. Yet there have been few studies focusing on a majority of the world’s population, semi-literate communities, in terms of the fundamental know-how for designing icons for such populations. In this study, two sets of icons belonging to different icon taxonomies, abstract and concrete, are designed for a mobile application for semi-literate agricultural communities. In this paper, we propose a triadic relationship of an icon, namely meaning, task, and mental image, which inherits the triadic relationship of a sign. User testing with the application and a post-pilot questionnaire are conducted as the experimental approach in two rural villages in India. Icons belonging to the concrete taxonomy perform better than abstract icons, on the premise that the design of the icon fulfills the underlying rules of the proposed triadic relationship.

Keywords: icon, GUI, mobile app, semi-literate

Procedia PDF Downloads 472
500 Quantum Localization of Vibrational Mirror in Cavity Optomechanics

Authors: Madiha Tariq, Hena Rabbani

Abstract:

Recently, cavity optomechanics has become an extensive research field that manipulates the mechanical effects of light to couple the optical field with other physical objects, specifically with regard to dynamical localization. We investigate the dynamical localization (both in momentum and position space) of a vibrational mirror in a Fabry-Pérot cavity driven by a single-mode optical field and a transverse probe field. The weak probe field results in classical chaos in phase space, and the spatio-temporal dynamics in position space |ψ(x)|² and momentum space |ψ(p)|² versus time show quantum localization in both momentum and position space. We also discuss the parametric dependencies of dynamical localization for a designated set of experimentally feasible parameters. Our work opens an avenue to manipulate other optical phenomena, and the applicability of the proposed work can be extended to tunable laser sources in the future.

Keywords: dynamical localization, cavity optomechanics, Hamiltonian chaos, probe field

Procedia PDF Downloads 135
499 Monomial Form Approach to Rectangular Surface Modeling

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Geometric modeling plays an important role in the construction and manufacturing of curve, surface, and solid models. Their algorithms are critically important not only in the automobile, ship, and aircraft manufacturing business, but are also absolutely necessary in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics, and visualization. The calculation and display of geometric objects can be accomplished by these six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form approach, and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) will be used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong, and NB1 surfaces. Some examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation, and degree reduction.
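In the monomial (coefficient matrix) form, a degree (m, n) rectangular patch can be evaluated as S(u, v) = U C Vᵀ with U = [1, u, …, uᵐ] and V = [1, v, …, vⁿ]. The numerical sketch below uses an arbitrary coefficient tensor purely for illustration, not a Said-Ball or Wang-Ball example from the paper.

```python
import numpy as np

def eval_monomial_patch(C, u, v):
    """Evaluate S(u, v) = sum_{i,j} C[i, j] * u**i * v**j.
    C has shape (m+1, n+1, 3): monomial coefficients of a 3D rectangular patch."""
    m, n = C.shape[0] - 1, C.shape[1] - 1
    U = np.array([u**i for i in range(m + 1)])      # [1, u, ..., u^m]
    V = np.array([v**j for j in range(n + 1)])      # [1, v, ..., v^n]
    return np.einsum("i,ijk,j->k", U, C, V)         # 3D point on the surface

# arbitrary bicubic coefficients, purely for illustration
C = np.random.rand(4, 4, 3)
print(eval_monomial_patch(C, 0.25, 0.75))
```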

Keywords: monomial forms, rectangular surfaces, CAGD curves, monomial matrix applications

Procedia PDF Downloads 136
498 Automated Detection of Women Dehumanization in English Text

Authors: Maha Wiss, Wael Khreich

Abstract:

Animals, objects, foods, plants, and other non-human terms are commonly used as a source of metaphors to describe females in formal and slang language. Comparing women to non-human items not only reflects cultural views that might conceptualize women as subordinates or in a lower position than humans, yet it conveys this degradation to the listeners. Moreover, the dehumanizing representation of females in the language normalizes the derogation and even encourages sexism and aggressiveness against women. Although dehumanization has been a popular research topic for decades, according to our knowledge, no studies have linked women's dehumanizing language to the machine learning field. Therefore, we introduce our research work as one of the first attempts to create a tool for the automated detection of the dehumanizing depiction of females in English texts. We also present the first labeled dataset on the charted topic, which is used for training supervised machine learning algorithms to build an accurate classification model. The importance of this work is that it accomplishes the first step toward mitigating dehumanizing language against females.
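The abstract does not name the classifier used; as a baseline sketch only, a TF-IDF plus logistic regression pipeline in scikit-learn shows how such a labeled dataset could train a binary "dehumanizing vs. not" model (the example sentences are invented placeholders).

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Placeholder training data; a real model would use the paper's labeled corpus.
texts = ["example sentence one", "example sentence two",
         "example sentence three", "example sentence four"]
labels = [1, 0, 1, 0]   # 1 = dehumanizing depiction, 0 = neutral

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),   # word and bigram features
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)
print(model.predict(["another example sentence"]))
```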

Keywords: gender bias, machine learning, NLP, women dehumanization

Procedia PDF Downloads 66
497 Mathematical Knowledge a Prerequisite for Science Education Courses in Tertiary Institution

Authors: Esther Yemisi Akinjiola

Abstract:

Mathematics has been regarded as the backbone of science and technological development, without which no nation can achieve any sustainable growth and development. Mathematics is a useful tool for simplifying science by quantifying phenomena; hence, physics and chemistry cannot be done without calculus and statistics. Mathematics is used in the physical sciences to calculate the measurements of objects and their characteristics, as well as to show the relationships between different functions and properties. Mathematics is the building block for everything in our daily lives, including mobile devices, architectural design, ancient arts, engineering, and sports, among others. Therefore, the study of Mathematics is made compulsory at the primary, basic, and secondary school levels. Thus, this paper discusses the concepts of Mathematics and science and their relationship. It also discusses the Mathematics content needed to study science-oriented courses such as physics education, chemistry education, and biology education in tertiary institutions. The paper concludes that without adequate knowledge of Mathematics, it will be difficult, if not impossible, for science education students to cope in their field of study.

Keywords: mathematical knowledge, prerequisite, science education, tertiary institution

Procedia PDF Downloads 72
496 Low Cost LiDAR-GNSS-UAV Technology Development for PT Garam’s Three Dimensional Stockpile Modeling Needs

Authors: Mohkammad Nur Cahyadi, Imam Wahyu Farid, Ronny Mardianto, Agung Budi Cahyono, Eko Yuli Handoko, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan

Abstract:

Unmanned aerial vehicle (UAV) technology has cost-efficiency and data-retrieval-time advantages. Technologies such as UAV, GNSS, and LiDAR are combined here into a single integrated system in which they cover each other's deficiencies. This integration aims to increase the accuracy of calculating the volume of the land stockpiles of PT. Garam (a salt company). UAV imagery is used to obtain geometric data and capture the textures that characterize the structure of objects. This study uses the Taror 650 Iron Man drone with four propellers, which can fly for 15 minutes. The LiDAR data can be classified based on the number of image acquisitions processed in the software, utilizing photogrammetry and Structure from Motion point cloud principles. LiDAR enables the creation of point clouds, three-dimensional models, digital surface models, contours, and orthomosaics with high accuracy. A drawback of LiDAR is that its coordinate positions have only a local reference. Therefore, the researchers use multi-sensor GNSS, LiDAR, and drone technology to map the salt stockpiles on open land and in warehouses, which PT. Garam carries out twice a year; the previous process used terrestrial methods and manual calculations with sacks. LiDAR needs to be combined with the UAV to overcome data acquisition limitations, because on its own it only passes along the right and left sides of the object, particularly when applied to a salt stockpile. The UAV is flown to assist data acquisition with wide coverage, with the 200-gram LiDAR system integrated so that the flying angle can be kept optimal during the flight. Using LiDAR for low-cost mapping surveys will make it easier for surveyors and academics to obtain fairly accurate data at a more economical price. As a survey tool, LiDAR is a low-priced device, around 999 USD, that can produce detailed data. Therefore, to minimize the operational costs of using LiDAR, surveyors can use low-cost LiDAR, GNSS, and UAV at a price of around 638 USD. The data generated by this sensor is a visualization of an object's shape in three dimensions. This study aims to combine low-cost GPS measurements with low-cost LiDAR, processed using free software. The low-cost GPS generates latitude and longitude coordinates for position determination, providing the X, Y, and Z values used to georeference the detected objects. The LiDAR detects objects, including the heights of the entire environment at the location. The resulting data are calibrated with pitch, roll, and yaw to obtain the vertical heights of the existing contours. The experiment was conducted on the roof of a building with a radius of approximately 30 meters.
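One step mentioned above, calibrating the LiDAR returns with pitch, roll, and yaw to recover vertical heights, amounts to rotating the points from the sensor frame into a level frame. The sketch below assumes a ZYX rotation convention and synthetic data; the angle sources and signs are assumptions, not the authors' calibration.

```python
import numpy as np

def rpy_rotation(roll, pitch, yaw):
    """Rotation matrix for roll (x), pitch (y), yaw (z), ZYX convention, radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def level_points(points, roll, pitch, yaw):
    """Rotate an (N, 3) LiDAR point cloud into a level (gravity-aligned) frame."""
    return points @ rpy_rotation(roll, pitch, yaw).T

pts = np.random.rand(1000, 3)                       # placeholder point cloud
leveled = level_points(pts, roll=0.02, pitch=-0.01, yaw=1.57)
print(leveled[:, 2].max())                          # vertical extent after leveling
```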

Keywords: LiDAR, unmanned aerial vehicle, low-cost GNSS, contour

Procedia PDF Downloads 73
495 Magnetic Simulation of the Underground Electric Cable in the Presence of a Short Circuit and Harmonics

Authors: Ahmed Nour El Islam Ayad, Wafa Krika, Abdelghani Ayad, Moulay Larab, Houari Boudjella, Farid Benhamida

Abstract:

The purpose of this study is to evaluate the magnetic emission of a high-voltage underground electric cable, because such power lines generate electromagnetic interactions with nearby objects. This work presents a numerical simulation of the magnetic field of a buried 400 kV line in three cases: the permanent state, the transient state of a short circuit, and, lastly, the presence of harmonics at different positions as a function of time, solved by the finite element method using COMSOL Multiphysics software. The results obtained show that the amplitude and distribution of the magnetic flux density change in the transient state and in the presence of harmonics. This work calculates the magnetic field generated by underground lines in order to evaluate their impact on ecology and health.
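The paper's results come from a finite element model in COMSOL; the sketch below is only a rough order-of-magnitude check using the long-straight-conductor approximation B = μ0·I/(2π·r), superposed over three phases with made-up geometry and currents.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def flux_density(point, conductors):
    """Superpose B of long straight conductors (2D cross-section approximation).
    conductors: list of (x, y, current_A); point: (x, y) in metres."""
    B = np.zeros(2)
    for (xc, yc, I) in conductors:
        dx, dy = point[0] - xc, point[1] - yc
        r2 = dx**2 + dy**2
        # Magnitude mu0*I/(2*pi*r), direction perpendicular to the radius vector.
        B += MU0 * I / (2 * np.pi * r2) * np.array([-dy, dx])
    return B

# three-phase cable 1.5 m deep, instantaneous currents (illustrative values only)
phases = [(-0.2, -1.5, 1000.0), (0.0, -1.5, -500.0), (0.2, -1.5, -500.0)]
B = flux_density((0.0, 0.0), phases)   # at ground level above the cable
print(np.linalg.norm(B) * 1e6, "microtesla")
```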

Keywords: underground, electric power cables, cables crossing, harmonic, emission

Procedia PDF Downloads 211
494 Incorporation of Noncanonical Amino Acids into Hard-to-Express Antibody Fragments: Expression and Characterization

Authors: Hana Hanaee-Ahvaz, Monika Cserjan-Puschmann, Christopher Tauer, Gerald Striedner

Abstract:

Incorporation of noncanonical amino acids (ncAA) into proteins has become an interesting topic as proteins featured with ncAAs offer a wide range of different applications. Nowadays, technologies and systems exist that allow for the site-specific introduction of ncAAs in vivo, but the efficient production of proteins modified this way is still a big challenge. This is especially true for 'hard-to-express' proteins where low yields are encountered even with the native sequence. In this study, site-specific incorporation of azido-ethoxy-carbonyl-Lysin (azk) into an anti-tumor-necrosis-factor-α-Fab (FTN2) was investigated. According to well-established parameters, possible site positions for ncAA incorporation were determined, and corresponding FTN2 genes were constructed. Each of the modified FTN2 variants has one amber codon for azk incorporated either in its heavy or light chain. The expression level for all variants produced was determined by ELISA, and all azk variants could be produced with a satisfactory yield in the range of 50-70% of the original FTN2 variant. In terms of expression yield, neither the azk incorporation position nor the subunit modified (heavy or light chain) had a significant effect. We confirmed correct protein processing and azk incorporation by mass spectrometry analysis, and antigen-antibody interaction was determined by surface plasmon resonance analysis. The next step is to characterize the effect of azk incorporation on protein stability and aggregation tendency via differential scanning calorimetry and light scattering, respectively. In summary, the incorporation of ncAA into our Fab candidate FTN2 worked better than expected. The quantities produced allowed a detailed characterization of the variants in terms of their properties, and we can now turn our attention to potential applications. By using click chemistry, we can equip the Fabs with additional functionalities and make them suitable for a wide range of applications. We will now use this option in a first approach and develop an assay that will allow us to follow the degradation of the recombinant target protein in vivo. Special focus will be laid on the proteolytic activity in the periplasm and how it is influenced by cultivation/induction conditions.

Keywords: degradation, FTN2, hard-to-express protein, non-canonical amino acids

Procedia PDF Downloads 214
493 Detection and Classification of Strabismus Using Convolutional Neural Network and Spatial Image Processing

Authors: Anoop T. R., Otman Basir, Robert F. Hess, Eileen E. Birch, Brooke A. Koritala, Reed M. Jost, Becky Luu, David Stager, Ben Thompson

Abstract:

Strabismus refers to a misalignment of the eyes. Early detection and treatment of strabismus in childhood can prevent the development of permanent vision loss due to abnormal development of visual brain areas. We developed a two-stage method for strabismus detection and classification based on photographs of the face. The first stage detects the presence or absence of strabismus, and the second stage classifies the type of strabismus. The first stage comprises face detection using Haar cascade, facial landmark estimation, face alignment, aligned face landmark detection, segmentation of the eye region, and detection of strabismus using VGG 16 convolution neural networks. Face alignment transforms the face to a canonical pose to ensure consistency in subsequent analysis. Using facial landmarks, the eye region is segmented from the aligned face and fed into a VGG 16 CNN model, which has been trained to classify strabismus. The CNN determines whether strabismus is present and classifies the type of strabismus (exotropia, esotropia, and vertical deviation). If stage 1 detects strabismus, the eye region image is fed into stage 2, which starts with the estimation of pupil center coordinates using mask R-CNN deep neural networks. Then, the distance between the pupil coordinates and eye landmarks is calculated along with the angle that the pupil coordinates make with the horizontal and vertical axis. The distance and angle information is used to characterize the degree and direction of the strabismic eye misalignment. This model was tested on 100 clinically labeled images of children with (n = 50) and without (n = 50) strabismus. The True Positive Rate (TPR) and False Positive Rate (FPR) of the first stage were 94% and 6% respectively. The classification stage has produced a TPR of 94.73%, 94.44%, and 100% for esotropia, exotropia, and vertical deviations, respectively. This method also had an FPR of 5.26%, 5.55%, and 0% for esotropia, exotropia, and vertical deviation, respectively. The addition of one more feature related to the location of corneal light reflections may reduce the FPR, which was primarily due to children with pseudo-strabismus (the appearance of strabismus due to a wide nasal bridge or skin folds on the nasal side of the eyes).
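The stage-2 characterization described above, distances and angles between the pupil center and eye landmarks, reduces to simple planar geometry. The sketch below uses made-up pixel coordinates and is not the mask R-CNN detection or the clinical thresholding itself.

```python
import numpy as np

def pupil_offset_features(pupil, landmark):
    """Distance and angles between the pupil center and an eye landmark,
    in image pixel coordinates (x right, y down)."""
    dx, dy = pupil[0] - landmark[0], pupil[1] - landmark[1]
    dist = float(np.hypot(dx, dy))
    angle_from_horizontal = float(np.degrees(np.arctan2(dy, dx)))
    angle_from_vertical = float(np.degrees(np.arctan2(dx, dy)))
    return dist, angle_from_horizontal, angle_from_vertical

# hypothetical pupil center and inner eye-corner landmark (pixels)
print(pupil_offset_features(pupil=(210.0, 118.0), landmark=(190.0, 120.0)))
```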

Keywords: strabismus, deep neural networks, face detection, facial landmarks, face alignment, segmentation, VGG 16, mask R-CNN, pupil coordinates, angle deviation, horizontal and vertical deviation

Procedia PDF Downloads 64
492 Mobile Robot Manipulator Kinematics Motion Control Analysis with MATLAB/Simulink

Authors: Wayan Widhiada, Cok Indra Partha, Gusti Ngurah Nitya Santhiarsa

Abstract:

The purpose of this paper is to investigate the use of Proportional-Integral-Derivative (PID) control to govern the kinematic motion of a mobile robot manipulator. Simulation and experimental methods are used to investigate how well PID control enables the mobile robot arm to collect and place several kinds of objects quickly, accurately, and correctly. Mathematical modeling is done by integrating SolidWorks and MATLAB/SimMechanics software. This method works by converting the physical model file into an XML file, and it is an easy, fast, and accurate way to model and design robots. The automatic control design of this robot manipulator is validated in simulation and in experiments in the control laboratory, as evidence that the mobile robot manipulator gripper control design can achieve the required performance: an error signal lower than 5%, small overshoot, and a fast, steady signal response.
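As a generic illustration of the PID law used (gains, time step, setpoint, and the toy plant are placeholders, not the tuned values of the manipulator):

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# toy first-order joint model driven toward a 1.0 rad setpoint
pid = PID(kp=8.0, ki=2.0, kd=0.5, dt=0.01)
angle = 0.0
for _ in range(500):
    u = pid.update(setpoint=1.0, measurement=angle)
    angle += 0.01 * (u - angle)        # crude plant: first-order lag
print(round(angle, 3))                 # should settle near 1.0
```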

Keywords: control analysis, kinematics motion, mobile robot manipulator, performance

Procedia PDF Downloads 389
491 Concepts of Modern Design: A Study of Art and Architecture Synergies in Early 20ᵗʰ Century Europe

Authors: Stanley Russell

Abstract:

Until the end of the 19th century, European painting dealt almost exclusively with the realistic representation of objects and landscapes, as can be seen in the work of realist artists like Gustave Courbet. Architects of the day typically made reference to and recreated historical precedents in their designs. The curriculum of the first architecture school in Europe, the École des Beaux-Arts, based on the study of classical buildings, had a profound effect on the profession. Painting exhibited an increasing level of abstraction from the late 19th century, with Impressionism, and the trend continued into the early 20th century, when Cubism had an explosive effect, sending shock waves through the art world that also extended into the realm of architectural design. Architect/painter Le Corbusier, with “Purism,” was one of the first to integrate abstract painting and building design theory in works that were equally shocking to the architecture world. The interrelationship of the arts, including architecture, was institutionalized in the Bauhaus curriculum, which sought to find commonality between diverse art disciplines. Renowned painter and Bauhaus instructor Vassily Kandinsky was one of the first artists to make a semi-scientific analysis of the elements in “non-objective” painting while also drawing parallels between painting and architecture in his book Point and Line to Plane. Russian constructivists made abstract compositions with simple geometric forms, and like the De Stijl group of the Netherlands, they also experimented with full-scale constructions and spatial explorations. Based on the study of historical accounts and original artworks of Impressionism, Cubism, the Bauhaus, De Stijl, and Russian Constructivism, this paper begins with a thorough explanation of the art theory and several key works from these important art movements of the late 19th and early 20th century. Similarly, based on written histories and first-hand experience of built and drawn works, the author continues with an analysis of the theories and architectural works generated by the same groups, all of which actively pursued continuity between their art and architecture concepts. With images of specific works, the author shows how the trend toward abstraction and geometric purity in painting coincided with a similar trend in architecture that favored simple, unornamented geometries. Using examples like the Villa Savoye, the Schroeder House, the Dessau Bauhaus, and unbuilt designs by the Russian architect Chernikov, the author gives detailed examples of how the intersection of trends in art and architecture led to a unique and fruitful period of creative synergy, when the same concepts that were used by artists to generate paintings were also used by architects in the making of objects, space, and buildings. In conclusion, this article examines the pivotal period in art and architecture history from the late 19th to the early 20th century, when the confluence of art and architectural theory led to many painted, drawn, and built works that continue to inspire architects and artists to this day.

Keywords: modern art, architecture, design methodologies, modern architecture

Procedia PDF Downloads 108
490 Generalized π-Armendariz Authentication Cryptosystem

Authors: Areej M. Abduldaim, Nadia M. G. Al-Saidi

Abstract:

Algebra is one of the important fields of mathematics. It concerns the study and manipulation of mathematical symbols, as well as the study of abstractions such as groups, rings, and fields. Due to the development of these abstractions, it has been extended to consider other structures, such as vectors, matrices, and polynomials, which are non-numerical objects. Computer algebra is the implementation of algebraic methods as algorithms and computer programs. Recently, many algebraic cryptosystem protocols based on non-commutative algebraic structures have been adopted for authentication, key exchange, and encryption-decryption processes. Cryptography is the science aimed at sending information through public channels in such a way that only an authorized recipient can read it. Ring theory is the most attractive category of algebra in the area of cryptography. In this paper, we employ the algebraic structure called skew π-Armendariz rings to design a neoteric algorithm for zero-knowledge proof. The proposed protocol is established and illustrated through a numerical example, and its soundness and completeness are proved.

Keywords: cryptosystem, identification, skew π-Armendariz rings, skew polynomial rings, zero knowledge protocol

Procedia PDF Downloads 198
489 Vegetation Assessment Under the Influence of Environmental Variables: A Case Study from the Yakhtangay Hill of Himalayan Range, Pakistan

Authors: Hameed Ullah, Shujaul Mulk Khan, Zahid Ullah, Zeeshan Ahmad, Sadia Jahangir, Abdullah, Amin Ur Rahman, Muhammad Suliman, Dost Muhammad

Abstract:

The interrelationship between vegetation and abiotic variables inside an ecosystem is one of the main concerns of plant scientists. This study was designed to investigate the vegetation structure and species diversity, along with the environmental variables, in the Yakhtangay hill, district Shangla, of the Himalayan mountain series of Pakistan, using multivariate statistical analysis. The quadrat method was used, and a total of 171 quadrats were laid down, 57 each for trees, shrubs, and herbs, to analyze the phytosociological attributes of the vegetation. The vegetation of the selected area was classified into different life forms and leaf forms according to the Raunkiaer classification, while PC-ORD software version 5 was used to classify the vegetation into different plant communities by Two-Way Indicator Species Analysis (TWINSPAN). CANOCO version 4.5 was used for DCA and CCA analysis to find the directions of variation of the vegetation with respect to different environmental variables. A total of 114 plant species belonging to 45 different families were recorded in the area. Rosaceae (12 species) was the dominant family, followed by Poaceae (10 species) and then Asteraceae (7 species). Monocots were more dominant than dicots, and angiosperms were more dominant than gymnosperms. Among the life forms, hemicryptophytes and nanophanerophytes were dominant, followed by therophytes, while among the leaf forms, microphylls were dominant, followed by leptophylls. It is concluded that edaphic factors, such as soil pH, soil organic matter concentration, calcium carbonate concentration, soil EC, and soil TDS, and physiographic factors, such as altitude and slope, affect the structure of the vegetation, species composition, and species diversity at a significant level (p-value ≤ 0.05). The vegetation of the selected area was classified into four major plant communities, and the indicator species for each community were recorded. The classification of plants into four different communities based upon edaphic gradients favors the individualistic hypothesis. Indicator Species Analysis (ISA) shows that the indicators of the study area are mostly indicators of the Himalayan or moist temperate ecosystem; furthermore, these indicators could be considered for micro-habitat conservation and respective ecosystem management plans.

Keywords: species richness, edaphic gradients, canonical correspondence analysis (CCA), TWCA

Procedia PDF Downloads 131
488 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media

Authors: Jinghui Peng, Shanyu Tang, Jia Li

Abstract:

Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is an imminent demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect the existence of secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data is then transferred from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain between original VoIP streams and stego VoIP streams are compared in turn using a t-test, achieving a p-value of 7.5686E-176, which is below the threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.
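A condensed sketch of the statistical core described above, comparing power spectra of two sets of frames with a t-test, is given below using NumPy/SciPy; the synthetic signals are only stand-ins for captured cover and stego VoIP frames.

```python
import numpy as np
from scipy import stats

def power_spectrum(frame):
    """Power spectrum of one audio frame via the real FFT (Hann-windowed)."""
    spectrum = np.fft.rfft(frame * np.hanning(len(frame)))
    return np.abs(spectrum) ** 2

rng = np.random.default_rng(0)
# Stand-ins for cover and stego frames: embedding modeled as tiny LSB-like noise.
cover = [rng.normal(size=512) for _ in range(200)]
stego = [c + rng.normal(scale=0.05, size=512) for c in cover]

cover_power = np.array([power_spectrum(f).mean() for f in cover])
stego_power = np.array([power_spectrum(f).mean() for f in stego])

t_stat, p_value = stats.ttest_ind(cover_power, stego_power)
print(t_stat, p_value)   # a small p-value would flag a shift in the spectra
```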

Keywords: steganalysis, security, Fast Fourier Transform, streaming media

Procedia PDF Downloads 128
487 Application of Advanced Remote Sensing Data in Mineral Exploration in the Vicinity of Heavy Dense Forest Cover Area of Jharkhand and Odisha State Mining Area

Authors: Hemant Kumar, R. N. K. Sharma, A. P. Krishna

Abstract:

The study was carried out in the Saranda area of Jharkhand and a part of Odisha state. Geospatial data from Hyperion, a hyperspectral remote sensing satellite, have been used. The study applied a wide variety of image processing techniques to enhance and extract the Fe and Mn ore mining classes. Landsat-8 OLI sensor data have also been used to correctly explore the related minerals. In this way, various processes have been applied to enhance the mineralogy classes, and a comparative evaluation with the related frequencies was performed. The Hyperion hyperspectral dataset has been specifically verified as an effective tool for extracting mineral and rock information within the shortwave infrared band range used. The abundant spatial and spectral information contained in hyperspectral images enables the differentiation of objects for targeted exploration applications such as mineral detection and mining.
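The abstract does not spell out the mapping algorithm; a common baseline for hyperspectral mineral mapping is the spectral angle mapper (SAM), sketched below as an assumption-level illustration rather than the authors' workflow. The cube, reference spectrum, and threshold are synthetic placeholders.

```python
import numpy as np

def spectral_angle_map(cube, reference):
    """Angle (radians) between each pixel spectrum and a reference spectrum.
    cube: (rows, cols, bands); reference: (bands,). Small angles = close match."""
    pixels = cube.reshape(-1, cube.shape[-1]).astype(np.float64)
    ref = reference.astype(np.float64)
    cos = pixels @ ref / (np.linalg.norm(pixels, axis=1) * np.linalg.norm(ref) + 1e-12)
    angles = np.arccos(np.clip(cos, -1.0, 1.0))
    return angles.reshape(cube.shape[:2])

# synthetic stand-ins for a Hyperion subset and an ore library spectrum
cube = np.random.rand(100, 100, 50)
reference = np.random.rand(50)
ore_mask = spectral_angle_map(cube, reference) < 0.1   # threshold is illustrative
print(ore_mask.sum())
```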

Keywords: Hyperion, hyperspectral, sensor, Landsat-8

Procedia PDF Downloads 105
486 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques

Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev

Abstract:

Rapidly evolving modern data analysis technologies in healthcare play a large role in understanding the operation of the system and its characteristics. Nowadays, one of the key tasks in urban healthcare is to optimize resource allocation. Thus, the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between the indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators and the demand of the medical facility. The hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes, and lithotripters), and physicians characterized the resource provision of medical institutions for the developed models. The data source for the research was the open database of the statistical service Eurostat. The choice of this source is due to the fact that its databases contain complete and open information necessary for research tasks in the field of public health. In addition, the statistical database has a user-friendly interface that allows analytical reports to be built quickly. The study provides information on 28 European countries for the period from 2007 to 2016. For all countries included in the study, with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and interpretability of the models was made by cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration, to identify groups of similar countries, and to construct separate regression models for them. Therefore, the original time series were used as the objects of clustering. The k-medoids algorithm was used, and the sampled objects served as the centers of the clusters obtained, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: for example, in one of the clusters, the MAPE was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The predicted values of the developed models have a relatively low level of error and can be used to make decisions on the provision of the hospital with medical personnel. The research displays the strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Currently, data analysis has huge potential to significantly improve health services. Medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
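As a compact sketch of the clustering step only (a plain alternating k-medoids on a precomputed distance matrix plus the silhouette score; the synthetic series stand in for the country panel, and this is not the exact configuration of the study):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.metrics import silhouette_score

def k_medoids(D, k, n_iter=100, seed=0):
    """Plain alternating k-medoids on a precomputed distance matrix D (n x n)."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(D), size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)       # assign to nearest medoid
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members):                            # keep old medoid if cluster empties
                within = D[np.ix_(members, members)].sum(axis=1)
                new_medoids[c] = members[np.argmin(within)]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return labels, medoids

# synthetic stand-in for the country panel: 28 series of 10 yearly values
series = np.random.default_rng(1).normal(size=(28, 10)).cumsum(axis=1)
D = squareform(pdist(series, metric="euclidean"))

k_best = max(range(2, 7),
             key=lambda k: silhouette_score(D, k_medoids(D, k)[0], metric="precomputed"))
labels, medoids = k_medoids(D, k_best)
print("chosen k:", k_best, "medoid indices:", medoids)
```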

Keywords: data analysis, demand modeling, healthcare, medical facilities

Procedia PDF Downloads 130
485 An Alternative Credit Scoring System in China’s Consumer Lending Market: A System Based on Digital Footprint Data

Authors: Minjuan Sun

Abstract:

Ever since the late 1990s, China has experienced explosive growth in consumer lending, especially in short-term consumer loans, among which the growth rate of non-bank lending has surpassed bank lending due to developments in financial technology. On the other hand, China does not have a universal credit scoring and registration system that can guide lenders during the processes of credit evaluation and risk control; for example, an individual’s bank credit records are not available for online lenders to see, and vice versa. Given this context, the purpose of this paper is three-fold. First, we explore if and how alternative digital footprint data can be utilized to assess a borrower’s creditworthiness. Then, we perform a comparative analysis of machine learning methods for the canonical problem of credit default prediction. Finally, we analyze, from an institutional point of view, the necessity of establishing a viable and nationally universal credit registration and scoring system utilizing online digital footprints, so that more people in China can have better access to the consumer loan market. Two different types of digital footprint data are matched with a bank’s loan default records. Each separately captures distinct dimensions of a person’s characteristics, such as shopping patterns and certain aspects of personality or inferred demographics revealed by social media features like profile image and nickname. We find that both datasets can generate either acceptable or excellent prediction results, and that different types of data tend to complement each other for better performance. Typically, the traditional types of data that banks normally use, like income, occupation, and credit history, update over longer cycles and hence cannot reflect more immediate changes, like the financial status changes caused by a business crisis; digital footprints, by contrast, can update daily, weekly, or monthly, and are thus capable of providing a more comprehensive profile of the borrower’s credit capabilities and risks. From the empirical and quantitative examination, we believe digital footprints can become an alternative information source for creditworthiness assessment, because of their near-universal data coverage and because they can by and large resolve the "thin-file" issue, since digital footprints come in much larger volume and at higher frequency.
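As an illustration only of the default-prediction setup (the features below are invented stand-ins for digital-footprint variables, and the paper's own comparison covers several learners, not just gradient boosting):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-ins for digital-footprint features (shopping frequency,
# session hour, profile-image flag, ...) and a loan default label.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 6))
logits = 1.2 * X[:, 0] - 0.8 * X[:, 3] + rng.normal(scale=0.5, size=5000)
y = (logits > 1.0).astype(int)                     # minority "default" class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]             # predicted default probability
print("AUC:", round(roc_auc_score(y_te, scores), 3))
```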

Keywords: credit score, digital footprint, Fintech, machine learning

Procedia PDF Downloads 143
484 Developing Heat-Power Efficiency Criteria for Characterization of Technosphere Structural Elements

Authors: Victoria Y. Garnova, Vladimir G. Merzlikin, Sergey V. Khudyakov, Aleksandr A. Gajour, Andrei P. Garnov

Abstract:

This paper analyzes the characteristics of the heat-energy objects of industrial and lifestyle facilities as part of the thermal envelope of the Earth's surface, for inclusion in databases for economic forecasting. An idealized model of the Earth's surface is discussed. This model makes it possible to obtain an energy equivalent for each element of the terrain and the world ocean. An energy efficiency criterion of comfortable human existence is introduced. The dynamics of changes in this criterion offer the possibility of simulating possible technogenic catastrophes under spontaneous industrial development of certain Earth areas. A calculated model is given, with the confirmed forecast of the Gulf Stream freezing in the polar regions in 2011 due to the heat-energy balance disturbance of the oil-polluted oceanic subsurface layer. Two opposing trends of human development, under limited and unlimited amounts of heat-energy resources, are analyzed.

Keywords: Earth's surface, heat-energy consumption, energy criteria, technogenic catastrophes

Procedia PDF Downloads 305
483 A Comprehensive Study of Camouflaged Object Detection Using Deep Learning

Authors: Khalak Bin Khair, Saqib Jahir, Mohammed Ibrahim, Fahad Bin, Debajyoti Karmaker

Abstract:

Object detection is a computer technology that deals with searching through digital images and videos for occurrences of semantic elements of a particular class. It is associated with image processing and computer vision. On top of object detection, we detect camouflaged objects within an image using deep learning techniques, a subset of machine learning based on multi-layer neural networks. Over 6,500 images that possess camouflage properties were gathered from various internet sources and divided into 4 categories to compare the results. These images were labeled and then trained and tested using the VGG16 architecture in a Jupyter notebook on the TensorFlow platform. The architecture is further customized using transfer learning, which creates methods for transferring information from one or more source tasks to improve learning in a related target task. The purpose of these transfer-learning methodologies is to aid in the evolution of machine learning to the point where it is as efficient as human learning.
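A minimal sketch of a VGG16 transfer-learning setup on TensorFlow/Keras follows; the image size, class count, and dataset path are placeholders, not the paper's configuration.

```python
import tensorflow as tf

NUM_CLASSES = 4   # the four camouflage categories mentioned above (assumed)

base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False                 # transfer learning: freeze ImageNet weights

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# usage (hypothetical directory of labeled camouflage images):
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "camouflage_dataset/", image_size=(224, 224), batch_size=32)
# model.fit(train_ds, epochs=10)
```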

Keywords: deep learning, transfer learning, TensorFlow, camouflage, object detection, architecture, accuracy, model, VGG16

Procedia PDF Downloads 122
482 Metareasoning Image Optimization Q-Learning

Authors: Mahasa Zahirnia

Abstract:

The purpose of this paper is to explore new and effective ways of optimizing satellite images using artificial intelligence, and the process of implementing reinforcement learning to enhance the quality of data captured within the image. In our implementation of Bellman's reinforcement learning equations, associated state diagrams, and multi-stage image processing, we were able to enhance image quality and to detect and define objects. Reinforcement learning is a differentiator in the area of artificial intelligence, and Q-Learning relies on trial and error to achieve its goals. The reward system embedded in Q-Learning allows the agent to self-evaluate its performance and decide on the best possible course of action based on the current and future environment. Results show that, within a simulated environment built on commercially available images, the rate of detection was 40-90%. Reinforcement learning through the Q-Learning algorithm is not just a desirable but a required design criterion for image optimization and enhancement. The proposed methods are a cost-effective way of resolving uncertainty in the data, because reinforcement learning finds ideal policies to manage the process using a smaller sample of images.
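The Q-Learning update at the heart of the approach follows the standard Bellman form Q(s,a) ← Q(s,a) + α[r + γ·max_a' Q(s',a') − Q(s,a)]. The tabular sketch below is generic: the states, actions, and reward stub are placeholders, not the image-enhancement environment of the paper.

```python
import numpy as np

N_STATES, N_ACTIONS = 10, 4          # placeholder discretization
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

Q = np.zeros((N_STATES, N_ACTIONS))
rng = np.random.default_rng(0)

def step(state, action):
    """Toy environment stub standing in for one image-enhancement step."""
    next_state = (state + action) % N_STATES
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    return next_state, reward

for episode in range(500):
    s = rng.integers(N_STATES)
    for _ in range(50):
        # epsilon-greedy action selection (trial and error)
        a = rng.integers(N_ACTIONS) if rng.random() < EPSILON else int(np.argmax(Q[s]))
        s_next, r = step(s, a)
        # Bellman / Q-learning update
        Q[s, a] += ALPHA * (r + GAMMA * np.max(Q[s_next]) - Q[s, a])
        s = s_next

print(np.argmax(Q, axis=1))          # greedy action per state after training
```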

Keywords: Q-learning, image optimization, reinforcement learning, Markov decision process

Procedia PDF Downloads 196