Search results for: SURF features
3152 Identification of Watershed Landscape Character Types in Middle Yangtze River within Wuhan Metropolitan Area
Authors: Huijie Wang, Bin Zhang
Abstract:
In China, the middle reaches of the Yangtze River are well developed, boasting a wealth of watershed landscape types. In this regard, landscape character assessment (LCA) can serve as a basis for the protection, management, and planning of trans-regional watershed landscape types. For this study, we chose the middle reaches of the Yangtze River in the Wuhan metropolitan area as our study site, where the water system contains a rich variety of landscape types. We analyzed trans-regional data to cluster and identify landscape character types at two levels. Fifty-five basins were analyzed using topography, land cover, and river system variables in order to identify the watershed landscape character types. For the watershed landscape, drainage density and degree of curvature were specified as special variables to directly reflect regional differences in river system features. We then used principal component analysis (PCA) and a hierarchical clustering algorithm, based on a geographic information system (GIS) and the Statistical Product and Service Solutions (SPSS) package, to divide the watershed landscapes into 8 characteristic groups. These groups highlight the watershed landscape characteristics of different river systems as well as key landscape characteristics that can serve as a basis for targeted protection of watershed landscape character, thus helping to rationally develop multi-value landscape resources and promote coordinated trans-regional development.
Keywords: GIS, hierarchical clustering, landscape character, landscape typology, principal component analysis, watershed
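The PCA-plus-hierarchical-clustering pipeline described above can be sketched in a few lines; the 55×10 basin matrix is a random stand-in, and the 85% variance cut-off and Ward linkage are assumptions, since the abstract does not specify them:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Hypothetical stand-in for 55 basins x 10 terrain/land-cover/river variables
X = rng.normal(size=(55, 10))

# Standardize, then PCA via SVD; keep components explaining ~85% of variance
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
var_ratio = s**2 / np.sum(s**2)
k = int(np.searchsorted(np.cumsum(var_ratio), 0.85)) + 1
scores = Xs @ Vt[:k].T          # basins projected onto k principal components

# Ward hierarchical clustering, cut into 8 character groups as in the study
Z = linkage(scores, method="ward")
groups = fcluster(Z, t=8, criterion="maxclust")
print(sorted(set(groups)))       # eight cluster labels, 1..8
```

The cut at 8 clusters mirrors the study's 8 characteristic groups; with real basin variables the dendrogram itself would guide that choice.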
Procedia PDF Downloads 233
3151 HLB Disease Detection in Omani Lime Trees using Hyperspectral Imaging Based Techniques
Authors: Jacintha Menezes, Ramalingam Dharmalingam, Palaiahnakote Shivakumara
Abstract:
In recent years, Omani acid lime cultivation and production has been affected by citrus greening, or Huanglongbing (HLB), disease. HLB is one of the most destructive diseases of citrus, with no remedies or countermeasures to stop it. The currently used polymerase chain reaction (PCR) and enzyme-linked immunosorbent assay (ELISA) HLB detection tests require lengthy and labor-intensive laboratory procedures. Furthermore, the equipment and staff needed to carry out these procedures are frequently specialized, making them a less optimal solution for detecting the disease. The current research uses hyperspectral imaging technology for automatic detection of citrus trees with HLB disease. Omani citrus tree leaf images were captured with a portable Specim IQ hyperspectral camera. The research considered healthy, nutrition-deficient, and HLB-infected leaf samples, labeled on the basis of the PCR test. The high-resolution image samples were sliced into sub-cubes, which were further processed to obtain RGB images with spatial features. Similarly, RGB spectral slices were obtained through a moving window along the wavelength axis. The resized spectral-spatial RGB images were fed to a convolutional neural network for deep feature extraction. The current research was able to classify a given sample into the appropriate class with 92.86% accuracy, indicating the effectiveness of the proposed techniques. The significant bands that differ across the three types of leaves were found to be 560 nm, 678 nm, 726 nm, and 750 nm.
Keywords: Huanglongbing (HLB), hyperspectral imaging (HSI), Omani citrus, CNN
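The slicing step described above (a moving window along the wavelength axis yielding pseudo-RGB spatial slices) can be sketched as follows; the cube size, window width, and step are hypothetical, with 204 bands matching the Specim IQ sensor:

```python
import numpy as np

# Hypothetical hypercube: height x width x bands (Specim IQ captures 204 bands)
cube = np.random.rand(64, 64, 204).astype(np.float32)

def spectral_rgb_slices(cube, window=3, step=3):
    """Slide a window along the wavelength axis; each position yields a
    3-channel (pseudo-RGB) spatial image, as described in the abstract."""
    h, w, bands = cube.shape
    slices = []
    for start in range(0, bands - window + 1, step):
        slices.append(cube[:, :, start:start + window])
    return np.stack(slices)  # (n_slices, h, w, window)

slices = spectral_rgb_slices(cube)
print(slices.shape)  # (68, 64, 64, 3)
```

Each resulting 3-band slice could then be resized and passed to a CNN, as the abstract describes.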
Procedia PDF Downloads 81
3150 A Modified Nonlinear Conjugate Gradient Algorithm for Large Scale Unconstrained Optimization Problems
Authors: Tsegay Giday Woldu, Haibin Zhang, Xin Zhang, Yemane Hailu Fissuh
Abstract:
It is well known that the nonlinear conjugate gradient method is one of the most widely used first-order methods for solving large-scale unconstrained smooth optimization problems. Because of their low memory requirements, attractive theoretical features, practical computational efficiency, and nice convergence properties, nonlinear conjugate gradient methods play a special role in solving large-scale unconstrained optimization problems, which have important applications in the practical and scientific world. However, nonlinear conjugate gradient methods have limited information about the curvature of the objective function, and they are likely to be less efficient and robust than some second-order algorithms. To overcome these drawbacks, a new modified nonlinear conjugate gradient method is presented. The noticeable features of our work are that the new search direction possesses the sufficient descent property independent of any line search and that it belongs to a trust region. Under mild assumptions and the standard Wolfe line search technique, the global convergence of the proposed algorithm is established. Furthermore, to test its practical computational performance, numerical experiments are implemented on a set of large-dimensional unconstrained problems. The numerical results show that the proposed algorithm is efficient and robust compared with other similar algorithms.
Keywords: conjugate gradient method, global convergence, large scale optimization, sufficient descent property
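The modified method itself is not reproduced here, so as a stand-in, a classical Polak-Ribière+ nonlinear conjugate gradient sketch with an Armijo backtracking line search (simpler than the Wolfe search used in the paper) illustrates the family of methods on the Rosenbrock test function:

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1 - x[0])**2

def rosenbrock_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def cg_prplus(f, grad, x0, tol=1e-6, max_iter=5000):
    """Polak-Ribiere+ nonlinear CG with Armijo backtracking.
    A simplified stand-in for the modified method in the abstract: beta is
    clipped at zero and the direction is restarted if descent is lost."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, fx = 1.0, f(x)
        # Armijo backtracking line search along the descent direction d
        while alpha > 1e-12 and f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ update
        d = -g_new + beta * d
        if g_new @ d >= 0:       # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

x_star = cg_prplus(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print(np.round(x_star, 4))  # should approach the minimizer [1, 1]
```

The restart-on-ascent safeguard plays a role loosely analogous to the sufficient descent property claimed for the paper's direction, though the paper's direction achieves it without any line search.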
Procedia PDF Downloads 208
3149 On Early Verb Acquisition in Chinese-Speaking Children
Authors: Yating Mu
Abstract:
Young children acquire their native language with amazing rapidity. After noticing this interesting phenomenon, many linguists as well as psychologists devoted themselves to exploring the best explanations, and research on first language acquisition (FLA) emerged. Early lexical development is an important branch of children's FLA. The verb, the most grammatically complex syntactic category, is not only the core of exploring the syntactic structures of language but also plays a key role in analyzing semantic features; early verb development must therefore have a great impact on children's early lexical acquisition. Most scholars conclude that verbs are, in general, difficult to learn, because the problem in verb learning may lie more in mapping a specific verb onto an action or event than in learning the underlying relational concepts that the verb or relational term encodes. However, previous research on early verb development has mainly focused on the debate about whether there is a noun bias or a verb bias in children's early productive vocabulary. There is little research on the general characteristics of children's early verbs covering both semantic and syntactic aspects, let alone a general survey of Chinese-speaking children's verb acquisition. Therefore, the author attempts to examine the general conditions and characteristics of Chinese-speaking children's early productive verbs, based on data from a longitudinal study of three Chinese-speaking children. In order to present an overall picture of Chinese verb development, both semantic and syntactic aspects are covered in the present study. For the semantic analysis, a classification method is adopted first. The verb category is a sophisticated class in Mandarin, so it is quite necessary to divide it into smaller sub-types, making the research much easier.
By making a reasonable classification into eight verb classes on the basis of semantic features, the research aims to find out whether there exist any universal rules in Chinese-speaking children's verb development. With regard to the syntactic aspect of the verb category, a debate between the nativist account and the usage-based approach has lasted for quite a long time. By analyzing the longitudinal Mandarin data, the author attempts to find out whether usage-based theory can fully explain the characteristics of Chinese verb development. To sum up, this thesis applies a descriptive research method to investigate the acquisition and usage of Chinese-speaking children's early verbs, with the purpose of providing a new perspective on the semantic and syntactic features of early verb acquisition.
Keywords: Chinese-speaking children, early verb acquisition, verb classes, verb grammatical structures
Procedia PDF Downloads 367
3148 Advanced Magnetic Resonance Imaging in Differentiation of Neurocysticercosis and Tuberculoma
Authors: Rajendra N. Ghosh, Paramjeet Singh, Niranjan Khandelwal, Sameer Vyas, Pratibha Singhi, Naveen Sankhyan
Abstract:
Background: Tuberculoma and neurocysticercosis (NCC) are the two most common intracranial infections in developing countries. They often resemble each other on neuroimaging, and in the absence of typical imaging features they cause significant diagnostic dilemmas. Differentiation is extremely important to avoid empirical exposure to antitubercular medications or nonspecific treatment that allows disease progression. Purpose: Better characterization and differentiation of CNS tuberculoma and NCC using morphological and multiple advanced functional MRI techniques. Material and Methods: Fifty untreated patients (20 tuberculoma and 30 NCC) were evaluated using conventional and advanced sequences: CISS, SWI, DWI, DTI, magnetization transfer (MT), T2 relaxometry (T2R), perfusion, and spectroscopy. rCBV, ADC, FA, T2R, and MTR values and metabolite ratios were calculated from the lesion and normal parenchyma. Diagnosis was confirmed by typical biochemical, histopathological, and imaging features. Results: CISS was the most useful sequence for scolex detection (90% on CISS vs 73% on routine sequences); SWI also showed higher scolex detection ability. Mean values of ADC, FA, and T2R from the core and rCBV from the wall of the lesion differed significantly between tuberculoma and NCC (P < 0.05). Mean values of rCBV, ADC, FA, and T2R for tuberculoma vs NCC were 3.36 vs 1.3, 1.09×10⁻³ vs 1.4×10⁻³, 0.13×10⁻³ vs 0.09×10⁻³, and 88.65 ms vs 272.3 ms, respectively. Tuberculomas showed a high lipid peak, more choline, and lower creatine, with a Ch/Cr ratio > 1. The T2R value was the most significant parameter for differentiation, and cut-off values for each significant parameter are proposed. Conclusion: Quantitative MRI in combination with conventional sequences can better characterize and differentiate similar-appearing tuberculoma and NCC, and may be incorporated into routine protocols, potentially avoiding brain biopsy and empirical therapy.
Keywords: advanced functional MRI, differentiation, neurocysticercosis, tuberculoma
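The reported group means can be turned into a toy nearest-group-mean vote; this is purely illustrative (the study derives proper cut-off values for each parameter, which are not reproduced in the abstract):

```python
# Group means reported in the abstract: (tuberculoma, NCC)
MEANS = {
    "rCBV": (3.36, 1.30),
    "ADC":  (1.09e-3, 1.40e-3),
    "FA":   (0.13e-3, 0.09e-3),
    "T2R":  (88.65, 272.3),   # ms; reported as the most discriminative
}

def nearest_mean_label(measurements):
    """Toy vote: each measured parameter votes for the group whose reported
    mean it is closer to. Illustrative only, not the study's classifier."""
    votes = {"tuberculoma": 0, "NCC": 0}
    for key, value in measurements.items():
        tb, ncc = MEANS[key]
        if abs(value - tb) < abs(value - ncc):
            votes["tuberculoma"] += 1
        else:
            votes["NCC"] += 1
    return max(votes, key=votes.get)

print(nearest_mean_label({"rCBV": 3.1, "T2R": 95.0}))  # tuberculoma
```

With real data one would instead apply the study's proposed per-parameter cut-offs, with T2R weighted most heavily.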
Procedia PDF Downloads 568
3147 Landslide Hazard Zonation Using Satellite Remote Sensing and GIS Technology
Authors: Ankit Tyagi, Reet Kamal Tiwari, Naveen James
Abstract:
Landslides are the major geo-environmental problem of the Himalaya because of its high ridges, steep slopes, deep valleys, and complex stream systems. They are mainly triggered by rainfall and earthquakes and cause severe damage to life and property. In Uttarakhand, the Tehri reservoir rim area, situated in the Lesser Himalaya of the Garhwal hills, was selected for landslide hazard zonation (LHZ). The study utilized several types of data, including geological maps, topographic maps from the Survey of India, Landsat 8 imagery, and Cartosat DEM data. This paper presents the use of a weighted overlay method in LHZ using fourteen causative factors. The data layers generated and co-registered were slope, aspect, relative relief, soil cover, rainfall intensity, seismic ground shaking, seismic amplification at surface level, lithology, land use/land cover (LULC), normalized difference vegetation index (NDVI), topographic wetness index (TWI), stream power index (SPI), drainage buffer, and reservoir buffer. Seismic analysis is performed using peak horizontal acceleration (PHA) intensity and amplification factors in the evaluation of the landslide hazard index (LHI). Several digital image processing techniques, such as topographic correction, NDVI, and supervised classification, were widely used in terrain factor extraction. Lithological features, LULC, drainage pattern, lineaments, and structural features were extracted using digital image processing, while colour, tone, topography, and stream drainage pattern from the imagery were used to analyse geological features. The slope, aspect, and relative relief maps were created from the Cartosat DEM data, which were also used for the detailed drainage analysis, including TWI, SPI, drainage buffer, and reservoir buffer. In the weighted overlay method, the comparative importance of the causative factors is obtained from experience.
In this method, each influence factor is multiplied by the rating of the corresponding class, the result is reclassified, and the LHZ map is prepared. Further, based on the land-use map developed from remote sensing images, a landslide vulnerability study for the study area is carried out and presented in this paper.
Keywords: weighted overlay method, GIS, landslide hazard zonation, remote sensing
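The weighted overlay computation described above (weight × class rating, summed cell by cell, then reclassified into zones) can be sketched as follows; the factor subset, weights, ratings, and zone breakpoints are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (50, 50)  # toy raster grid

# Hypothetical expert weights for a few of the fourteen causative factors
weights = {"slope": 0.20, "rainfall": 0.15, "lithology": 0.15,
           "LULC": 0.10, "PHA": 0.20, "drainage_buffer": 0.20}

# Each factor raster already reclassified to ratings 1 (low) .. 5 (high)
rasters = {name: rng.integers(1, 6, size=shape) for name in weights}

# Weighted overlay: LHI = sum of weight x rating, cell by cell
lhi = sum(w * rasters[name] for name, w in weights.items())

# Reclassify the landslide hazard index into three zones
zones = np.digitize(lhi, bins=[2.5, 3.5])  # 0=low, 1=moderate, 2=high
print(lhi.min() >= 1.0, lhi.max() <= 5.0)
```

Because the weights sum to 1 and ratings lie in 1..5, the index stays in [1, 5]; a real study would choose weights and zone breakpoints from expert judgement and field validation.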
Procedia PDF Downloads 134
3146 Maximizing Profit Using Optimal Control by Exploiting the Flexibility in Thermal Power Plants
Authors: Daud Mustafa Minhas, Raja Rehan Khalid, Georg Frey
Abstract:
Next-generation power systems are equipped with abundantly available renewable energy resources (RES). During periods of low-cost RES operation, the price of electricity drops significantly and sometimes even becomes negative. It is therefore tempting not to operate traditional power plants (e.g., coal power plants) at all in order to reduce losses. In fact, this is not a cost-effective solution, because these power plants incur shutdown and startup costs. Moreover, they require a certain time to shut down and need a long enough pause before starting up again, increasing inefficiency in the whole power network. Hence, there is always a trade-off between avoiding negative electricity prices and the startup costs of power plants. To exploit this trade-off and increase the profit of a power plant, two main contributions are made: 1) introducing retrofit technology for a state-of-the-art coal power plant; 2) proposing an optimal control strategy for the power plant that exploits different flexibility features, namely an improved ramp rate, a reduced startup time, and a lower minimum load. The control strategy is solved as a mixed-integer linear program (MILP), ensuring an optimal solution to the profit maximization problem. Extensive comparisons are made between pre- and post-retrofit coal power plants with the same efficiencies under different electricity price scenarios. The study concludes that if the power plant must remain in the market (providing services), more flexibility translates into a direct economic advantage for the plant operator.
Keywords: discrete optimization, power plant flexibility, profit maximization, unit commitment model
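The trade-off between startup costs and negative prices can be illustrated with a brute-force miniature of the unit commitment problem (the paper solves a full MILP; all prices, costs, and plant limits below are invented):

```python
from itertools import product

prices = [40.0, -5.0, -8.0, 35.0, 50.0]   # EUR/MWh, hypothetical hourly prices
P_MIN, P_MAX = 60.0, 150.0                # MW, retrofit-lowered minimum load
MARGINAL_COST = 25.0                      # EUR/MWh fuel cost
STARTUP_COST = 2000.0                     # EUR per startup

def profit(schedule):
    """Profit of an on/off schedule: when on, dispatch P_MAX if the price
    beats marginal cost, else run at minimum load; pay startup on 0->1."""
    total, prev_on = 0.0, 0
    for on, price in zip(schedule, prices):
        if on:
            p = P_MAX if price >= MARGINAL_COST else P_MIN
            total += p * (price - MARGINAL_COST)
            if not prev_on:
                total -= STARTUP_COST
        prev_on = on
    return total

# Enumerate all on/off schedules (feasible only for a toy horizon)
best = max(product([0, 1], repeat=len(prices)), key=profit)
print(best, round(profit(best), 1))  # (1, 0, 0, 1, 1) 3500.0
```

With these numbers, paying a second startup beats riding through the negative-price hours at minimum load; changing `STARTUP_COST` or `P_MIN` flips that decision, which is exactly the flexibility trade-off the abstract describes.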
Procedia PDF Downloads 144
3145 Natural Patterns for Sustainable Cooling in the Architecture of Residential Buildings in Iran (Hot and Dry Climate)
Authors: Elnaz Abbasian, Mohsen Faizi
Abstract:
Over its thousand-year development, architecture has accumulated valuable patterns. Iran's desert regions possess well-developed patterns of traditional architecture and outstanding skeletal features. Unfortunately, increasing population and urbanization growth in the past decade, as well as a lack of harmony with the environment's texture, have destroyed such permanent concepts in building skeletons, causing a great deal of energy waste in modern architecture. The important question is how the cooling patterns of Iran's traditional architecture can be used in a new way in the modern architecture of residential buildings. This research is library-based and documentary: it looks at sustainable development, analyzes the features of Iranian architecture in the hot and dry climate in terms of sustainability as well as historical patterns, and builds a model for the real environment. Through methodological analysis of the past, it suggests a new pattern for cooling residential buildings in Iran's hot and dry climate which is in full accordance with the ecology of the design and at the same time possesses the architectural indices of the past. In the process of cities' physical development, ecological measures, in proportion to the desert's natural background and climate conditions, have kept the natural fences, preventing buildings from facing climate adversities. Designing and constructing buildings with this viewpoint can reduce the energy needed for maintaining and regulating environmental conditions and, with the use of appropriate building technology, help minimize the consumption of fossil fuels while preserving the permanent patterns of desert building architecture.
Keywords: sustainability concepts, sustainable development, energy climate architecture, fossil fuel, hot and dry climate, patterns of traditional sustainability for residential buildings, modern pattern of cooling
Procedia PDF Downloads 309
3144 Designing of Content Management Systems (CMS) for Web Development
Authors: Abdul Basit Kiani, Maryam Kiani
Abstract:
Content Management Systems (CMS) have transformed the landscape of web development by providing an accessible and efficient platform for creating and managing digital content. This abstract explores the key features and benefits of CMS in web development, highlighting its impact on website creation and maintenance. CMS offers a user-friendly interface that empowers individuals to create, edit, and publish content without requiring extensive technical knowledge. With customizable templates and themes, users can personalize the design and layout of their websites, ensuring a visually appealing online presence. Furthermore, CMS facilitates efficient content organization through categorization and tagging, enabling visitors to navigate and search for information effortlessly. It also supports version control, allowing users to track and manage revisions effectively. Scalability is a notable advantage of CMS, as it offers a wide range of plugins and extensions to integrate additional features into websites. From e-commerce functionality to social media integration, CMS adapts to evolving business needs. Additionally, CMS enhances collaborative workflows by allowing multiple user roles and permissions. This enables teams to collaborate effectively on content creation and management, streamlining processes and ensuring smooth coordination. In conclusion, CMS serves as a powerful tool in web development, simplifying content creation, customization, organization, scalability, and collaboration. With CMS, individuals and businesses can create dynamic and engaging websites, establishing a strong online presence with ease.
Keywords: web development, content management systems, information technology, programming
Procedia PDF Downloads 87
3143 Neutron Irradiated Austenitic Stainless Steels: An Applied Methodology for Nanoindentation and Transmission Electron Microscopy Studies
Authors: P. Bublíkova, P. Halodova, H. K. Namburi, J. Stodolna, J. Duchon, O. Libera
Abstract:
Neutron radiation-induced microstructural changes cause degradation of mechanical properties and reduce the lifetime of reactor internals during nuclear power plant operation. Investigating the effects of neutron irradiation on the mechanical properties of irradiated material (hardening, embrittlement) is challenging and time-consuming. Although the fast neutron spectrum has the major influence on microstructural properties, the thermal neutron effect is widely investigated owing to the irradiation-assisted stress corrosion cracking first observed in BWR stainless steels. In this study, 300-series austenitic stainless steels used as material for NPP internals were examined after neutron irradiation to ~15 dpa. Although several experimental publications describe nanoindentation on ion-irradiated materials, less has been published on neutron-irradiated materials at high dpa tested in hot cells. In this work, we present a methodology developed to determine the mechanical properties of neutron-irradiated steels by the nanoindentation technique. Furthermore, radiation-induced damage in the specimens was investigated by high-resolution transmission electron microscopy (HR-TEM), which revealed the defect features, particularly Frank loops, cavity microstructure, radiation-induced precipitates, and radiation-induced segregation. The results of the nanoindentation measurements and the associated nanoscale defect features showed the effect of irradiation-induced hardening. We also propose methodologies to optimize sample preparation for nanoindentation and microstructural studies.
Keywords: nanoindentation, thermal neutrons, radiation hardening, transmission electron microscopy
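Nanoindentation hardness is commonly reduced from the load-depth curve via the Oliver-Pharr method; a minimal sketch with invented indent values and an ideal Berkovich area function follows (the abstract does not state which reduction its methodology uses):

```python
import math

# Oliver-Pharr reduction of a single indent (all input values hypothetical)
P_max = 10e-3        # N, peak load
h_max = 300e-9       # m, depth at peak load
S = 150e3            # N/m, unloading stiffness dP/dh at peak load
EPS = 0.75           # geometry constant for a Berkovich tip

h_c = h_max - EPS * P_max / S           # contact depth
A_c = 24.5 * h_c ** 2                   # ideal Berkovich area function
H = P_max / A_c                         # hardness, Pa
E_r = math.sqrt(math.pi) * S / (2 * math.sqrt(A_c))  # reduced modulus, Pa

print(f"H = {H / 1e9:.2f} GPa, Er = {E_r / 1e9:.1f} GPa")
```

Comparing H before and after irradiation is one way the irradiation-induced hardening reported above would show up; a calibrated area function would replace the ideal 24.5·h² term in practice.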
Procedia PDF Downloads 158
3142 The Formation of Mutual Understanding in Conversation: An Embodied Approach
Authors: Haruo Okabayashi
Abstract:
Mutual understanding in conversation is very important for human relations. This study investigates the mental function underlying the formation of mutual understanding between two people in conversation using an embodied approach. Forty people participated in this study and were divided into pairs randomly. Four conversation situations between the two (making/listening to fun or pleasant talk, making/listening to regrettable talk) were set for four minutes each, and the finger plethysmogram (200 Hz) of each participant was measured. As a result, the attractors of the participants who reported "I did not understand my partner" show a collapsed shape, which means the fluctuation of their rhythm is too small to match their partner's rhythm, and their cross-correlation is low. The autonomic balance of both persons tends to resonate during conversation, and both largest Lyapunov exponents (LLEs) tend to resonate, too. In human history, in order for human beings, as weak mammals, to survive, they may have had to stay with others; that is, they have developed resonating characteristics, which is called self-organization. However, this resonant feature sometimes collapses, depending on the lifestyle the person formed after birth. It is difficult for people who do not have a lifestyle of mutual gaze to resonate their biological signal waves with others'. These people show features such as anxiety, fatigue, and a tendency toward confusion. Mutual understanding is thought to be formed as a result of cooperation between the self-organization features of the persons who are talking and the lifestyle indicated by mutual gaze. Such an entanglement phenomenon is called a nonlinear relation. This research finds that the formation of mutual understanding is expressed by the rhythm of a biological signal showing a nonlinear relationship.
Keywords: embodied approach, finger plethysmogram, mutual understanding, nonlinear phenomenon
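The cross-correlation of two partners' rhythms mentioned above can be sketched as a zero-lag Pearson correlation on synthetic 200 Hz signals (the frequency, lag, and noise level are invented; real plethysmogram analysis would also inspect lagged correlations and attractor shape):

```python
import numpy as np

fs = 200                       # Hz, the plethysmogram sampling rate used in the study
t = np.arange(0, 240, 1 / fs)  # four minutes of conversation

# Hypothetical rhythms: partner B tracks partner A with a small lag plus noise
rng = np.random.default_rng(2)
a = np.sin(2 * np.pi * 1.1 * t) + 0.3 * rng.normal(size=t.size)
b = np.sin(2 * np.pi * 1.1 * (t - 0.05)) + 0.3 * rng.normal(size=t.size)

def norm_xcorr(x, y):
    """Zero-lag normalized cross-correlation (Pearson r) of two signals."""
    x = x - x.mean()
    y = y - y.mean()
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

r = norm_xcorr(a, b)
print(round(r, 2))  # high when the two rhythms resonate
```

A pair whose rhythms fail to entrain (e.g., uncorrelated frequencies) would yield r near zero, the low-cross-correlation signature the abstract associates with "I did not understand my partner".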
Procedia PDF Downloads 267
3141 Variability of the Speaker's Verbal and Non-Verbal Behaviour in the Process of Changing Social Roles in the English Marketing Discourse
Authors: Yuliia Skrynnik
Abstract:
This research focuses on the interaction of the verbal, non-verbal, and super-verbal communicative components used by a speaker changing social roles in the marketing discourse. The changing/performing of social roles is implemented through communicative strategies and tactics, whose structural, semantic, and linguo-pragmatic means are characterized by specific features and differ depending on whether the role of a supplier or of a customer is performed. Communication within the marketing discourse is characterized by a symmetrical role relation between communicative opponents. The strategy of realizing a supplier's social role and the strategy of realizing a customer's role both influence the discursive personality's linguistic repertoire in the marketing discourse. This study takes into account that one person can be both a supplier and a customer under different circumstances, and thus explores the individual who can be both. Cooperative and non-cooperative tactics are the instruments for implementing these strategies. In the marketing discourse, the verbal and non-verbal behaviour of the speaker performing a customer's social role is highly informative for speakers who perform the role of a supplier. The research methods include discourse, context-situational, pragmalinguistic, and pragmasemantic analyses, and the method of non-verbal component analysis.
The methodology of the study includes 5 steps: 1) defining the configurations of speakers' social roles on the selected material; 2) establishing the type of the discourse (marketing discourse); 3) describing the specific features of a discursive personality as a subject of communication in the process of social role realization; 4) selecting the strategies and tactics which direct the interaction in different role configurations; 5) characterizing the structural, semantic, and pragmatic features of strategy and tactic realization, including the analysis of the interaction between verbal and non-verbal components of communication. In the marketing discourse, non-verbal behaviour is usually spontaneous rather than purposeful; thus, adequately decoding a partner's non-verbal behaviour provides more opportunities both for the supplier and for the customer. Super-verbal characteristics in the marketing discourse are crucial in defining the opponent's social status and social role at the initial stage of interaction. The research provides a scenario of stereotypical situations of the play of a supplier and a customer. The performed analysis opens perspectives for further research connected with the study of the discursive variability of speakers' verbal and non-verbal behaviour, considering the intercultural factor influencing the performance of social roles in the marketing discourse, and with the formation of methods for constructing scenarios of non-stereotypical situations of social role realization/change in the marketing discourse.
Keywords: discursive personality, marketing discourse, non-verbal component of communication, social role, strategy, super-verbal component of communication, tactic, verbal component of communication
Procedia PDF Downloads 123
3140 Prediction of Covid-19 Cases and Current Situation of Italy and Its Different Regions Using Machine Learning Algorithm
Authors: Shafait Hussain Ali
Abstract:
The Covid-19 disease is caused by the coronavirus SARS-CoV-2. Italy was the first Western country to be severely affected, and the first country to take drastic measures to control the disease. In early December 2019, a sudden outbreak of coronavirus disease, caused by the novel severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), occurred in the Chinese city of Wuhan. The World Health Organization declared the epidemic a public health emergency of international concern on January 30, 2020. By February 14, 2020, 49,053 laboratory-confirmed cases and 1,481 deaths had been reported worldwide. The threat of the disease has forced most governments to implement various control measures. It therefore becomes necessary to analyze the Italian data very carefully, in particular to investigate and present the current situation and the number of infected persons, in the form of positive cases, deaths, hospitalizations, and other features of infected persons, in a simple form. The model used should clearly show the real facts and figures, be understandable to every reader, and offer some real benefit after reading. It must include all the features (total positive cases, current positive cases, hospitalized patients, deaths, recovery rates) that explain the wide range of facts in a very simple form helpful to the administration of that country.
Keywords: machine learning tools and techniques, rapid miner tool, Naive-Bayes algorithm, predictions
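The keywords name the Naive Bayes algorithm (via the RapidMiner tool); a from-scratch Gaussian Naive Bayes sketch on synthetic regional features (all numbers invented, not Italian data) shows the underlying classification idea:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical regional features: [current positives, hospitalized, recovery rate]
# Class 0 = lower-risk region, class 1 = higher-risk region
X0 = rng.normal([100, 20, 0.8], [30, 8, 0.05], size=(50, 3))
X1 = rng.normal([900, 300, 0.4], [200, 80, 0.05], size=(50, 3))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

def fit(X, y):
    """Gaussian Naive Bayes: per-class feature means/variances and priors."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return params

def predict(params, x):
    """Pick the class with the highest log-posterior, features independent."""
    def log_post(c):
        mu, var, prior = params[c]
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        return ll + np.log(prior)
    return max(params, key=log_post)

model = fit(X, y)
print(predict(model, np.array([950.0, 280.0, 0.45])))  # 1 (higher-risk)
```

RapidMiner's Naive Bayes operator performs the same per-class density estimation behind its graphical interface.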
Procedia PDF Downloads 108
3139 Specific Language Impairment in Kannada: Evidence from a Morphologically Complex Language
Authors: Shivani Tiwari, Prathibha Karanth, B. Rajashekhar
Abstract:
Impairments of syntactic morphology are often considered central in children with Specific Language Impairment (SLI). In English and related languages, deficits of tense-related grammatical morphology can serve as a clinical marker of SLI. Yet, cross-linguistic studies of SLI in the recent past suggest that the nature and severity of morphosyntactic deficits in children with SLI vary with the language being investigated. Therefore, in the present study we investigated the morphosyntactic deficits in a group of children with SLI who speak Kannada, a morphologically complex Dravidian language spoken in the Indian subcontinent. A group of 15 children with SLI participated in this study. Two more groups of typically developing children (15 each), matched for language and age to the children with SLI, were included as control participants. All participants were assessed for morphosyntactic comprehension and expression using a standardized language test and a spontaneous speech task. Results showed that children with SLI differed significantly from the age-matched but not the language-matched control group on tasks of both comprehension and expression of morphosyntax. This finding, however, contrasts with reports of English-speaking children with SLI, who are reported to be poorer than younger MLU-matched children on tasks of morphosyntax. The observed difference between Kannada-speaking and English-speaking children with SLI is explained on the basis of the morphological richness theory, which predicts that children with SLI perform relatively better in a morphologically rich language due to the frequent and consistent occurrence of its morphological markers. The authors therefore conclude that language-specific features do influence the manifestation of the disorder in children with SLI.
Keywords: specific language impairment, morphosyntax, Kannada, manifestation
Procedia PDF Downloads 244
3138 Integrated Geophysical Surveys for Sinkhole and Subsidence Vulnerability Assessment, in the West Rand Area of Johannesburg
Authors: Ramoshweu Melvin Sethobya, Emmanuel Chirenje, Mihlali Hobo, Simon Sebothoma
Abstract:
The recent surge in residential infrastructure development around the metropolitan areas of South Africa has necessitated conditions for thorough geotechnical assessments to be conducted prior to site developments to ensure human and infrastructure safety. This paper appraises the success in the application of multi-method geophysical techniques for the delineation of sinkhole vulnerability in a residential landscape. Geophysical techniques ERT, MASW, VES, Magnetics and gravity surveys were conducted to assist in mapping sinkhole vulnerability, using an existing sinkhole as a constraint at Venterspost town, West of Johannesburg city. A combination of different geophysical techniques and results integration from those proved to be useful in the delineation of the lithologic succession around sinkhole locality, and determining the geotechnical characteristics of each layer for its contribution to the development of sinkholes, subsidence and cavities at the vicinity of the site. Study results have also assisted in the determination of the possible depth extension of the currently existing sinkhole and the location of sites where other similar karstic features and sinkholes could form. Results of the ERT, VES and MASW surveys have uncovered dolomitic bedrock at varying depths around the sites, which exhibits high resistivity values in the range 2500-8000ohm.m and corresponding high velocities in the range 1000-2400 m/s. The dolomite layer was found to be overlain by a weathered chert-poor dolomite layer, which has resistivities between the range 250-2400ohm.m, and velocities ranging from 500-600m/s, from which the large sinkhole has been found to collapse/ cave in. A compiled 2.5D high resolution Shear Wave Velocity (Vs) map of the study area was created using 2D profiles of MASW data, offering insights into the prevailing lithological setup conducive for formation various types of karstic features around the site. 
3D magnetic models of the site highlighted regions of possible subsurface interconnection between the existing large sinkhole and the other subsidence feature at the site. A number of depth slices were used to detail the conditions near the sinkhole with increasing depth. Gravity survey results mapped the possible formational pathways for the development of new karstic features around the site. Combining and correlating the different geophysical techniques proved useful in delineating the site's geotechnical characteristics and mapping the possible depth extent of the existing sinkhole.
Keywords: resistivity, magnetics, sinkhole, gravity, karst, delineation, VES
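As a hedged illustration of how the reported ranges could drive a first-pass interpretation, the sketch below encodes the resistivity and velocity ranges from the abstract as a simple layer classifier. The function name and the "unclassified" fallback are our own assumptions, not part of the study.

```python
def classify_layer(resistivity_ohm_m, vs_m_per_s):
    """Rough lithology call from the resistivity/velocity ranges reported for this site.

    Thresholds are taken from the abstract; anything outside them is left
    unclassified rather than guessed.
    """
    if 2500 <= resistivity_ohm_m <= 8000 and 1000 <= vs_m_per_s <= 2400:
        return "dolomitic bedrock"
    if 250 <= resistivity_ohm_m <= 2400 and 500 <= vs_m_per_s <= 600:
        return "weathered chert-poor dolomite"
    return "unclassified"
```

In practice such hard thresholds would only seed a joint interpretation; the study relies on cross-correlating several methods rather than any single cutoff.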
Procedia PDF Downloads 81
3137 Theoretical Investigations and Simulation of Electromagnetic Ion Cyclotron Waves in the Earth’s Magnetosphere Through Magnetospheric Multiscale Mission
Authors: A. A. Abid
Abstract:
Wave-particle interactions are considered paramount in the transmission of energy in collisionless space plasmas, where electromagnetic fields confine the movement of charged particles. The three essential populations of the inner magnetosphere are the cold plasmaspheric plasma, the ring current, and the high-energy particles of the radiation belts. The transition regions among these populations initiate wave-particle interactions between distinct plasmas, and a prominent wave mode observed in the magnetosphere is the electromagnetic ion cyclotron (EMIC) wave. These waves can interact resonantly with numerous particle species, but whether this is accompanied by heating of the plasma particles is still debated. In this work we pay particular attention to how EMIC waves impact plasma species, specifically how they affect the heating of electrons and ions during storms and substorms in the magnetosphere. Using the Magnetospheric Multiscale (MMS) mission and electromagnetic hybrid simulation, this project will investigate the energy transfer mechanisms (e.g., Landau interaction, bounce resonance interaction, cyclotron resonance interaction) between EMIC waves and cold-warm plasma populations. Other features, such as the production of EMIC waves and the importance of cold plasma particles in EMIC wave-particle interactions, will also be worth exploring.
Keywords: MMS, magnetosphere, wave-particle interaction, non-Maxwellian distribution
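For readers unfamiliar with the resonance mechanisms named above, the condition governing EMIC wave-particle energy exchange can be written as the standard Doppler-shifted resonance relation (textbook plasma theory, not a result of this abstract):

```latex
% A particle of species s with parallel velocity v_parallel resonates with
% a wave of frequency omega and parallel wavenumber k_parallel when
\omega - k_{\parallel} v_{\parallel} = n\,\Omega_{s},
\qquad \Omega_{s} = \frac{q_{s} B}{m_{s}}, \qquad n = 0, \pm 1, \pm 2, \dots
```

Here the harmonic number n = 0 corresponds to Landau resonance and n ≠ 0 to cyclotron resonance, the two channels the project sets out to disentangle with MMS data.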
Procedia PDF Downloads 63
3136 A Survey of Skin Cancer Detection and Classification from Skin Lesion Images Using Deep Learning
Authors: Joseph George, Anne Kotteswara Roa
Abstract:
Skin disease is one of the most common health issues faced by people nowadays. Skin cancer (SC) is one of them, and its detection relies on skin biopsy results and the expertise of doctors, which is time-consuming and can produce inaccurate results. Skin cancer detection is a challenging task at the early stage, yet the disease easily spreads through the whole body and leads to an increased mortality rate; it is curable when detected at an early stage. In order to classify skin cancer correctly and accurately, the critical task is skin cancer identification and classification, which relies on disease features such as shape, size, color and symmetry. Many skin diseases share similar characteristics, which makes it challenging to select the important features from skin cancer dataset images. Hence, an automated skin cancer detection and classification framework is required to improve diagnostic accuracy and cope with the scarcity of human experts. Recently, deep learning techniques such as the convolutional neural network (CNN), deep belief network (DBN), artificial neural network (ANN), recurrent neural network (RNN), and long short-term memory (LSTM) have been widely used for the identification and classification of skin cancers. This survey reviews different DL techniques for skin cancer identification and classification. Performance metrics such as precision, recall, accuracy, sensitivity, specificity, and F-measure are used to evaluate the effectiveness of SC identification using DL techniques. By using these DL techniques, classification accuracy increases while computational complexity and time consumption are mitigated.
Keywords: skin cancer, deep learning, performance measures, accuracy, datasets
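The evaluation metrics listed above all derive from the binary confusion matrix. As a minimal sketch (standard definitions, not code from any surveyed paper), they can be computed as:

```python
def classification_metrics(tp, fp, fn, tn):
    """Compute the survey's evaluation metrics from binary confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)              # recall is the same quantity as sensitivity
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f_measure = 2 * precision * recall / (precision + recall)
    return {
        "precision": precision,
        "recall": recall,
        "sensitivity": recall,
        "specificity": specificity,
        "accuracy": accuracy,
        "f_measure": f_measure,
    }
```

For example, a classifier with 40 true positives, 10 false positives, 20 false negatives and 30 true negatives has precision 0.8 but recall only 0.67, which is why the survey reports both rather than accuracy alone.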
Procedia PDF Downloads 132
3135 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria
Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov
Abstract:
This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model
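A common way to operationalize a conditional independence test for SCM validation is via partial correlation. The sketch below is a minimal, single-conditioning-variable version of that idea; the study's actual CIT criteria and algorithm may differ.

```python
import math

def pearson(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def partial_correlation(xs, ys, zs):
    """Correlation of x and y with the influence of a single conditioning
    variable z removed; |value| near 0 suggests conditional independence."""
    rxy, rxz, ryz = pearson(xs, ys), pearson(xs, zs), pearson(ys, zs)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
```

Two admission features that correlate only through a common cause would show a high marginal Pearson correlation but a partial correlation near zero given that cause, which is the pattern the CIT step screens for.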
Procedia PDF Downloads 66
3134 Micro-Meso 3D FE Damage Modelling of Woven Carbon Fibre Reinforced Plastic Composite under Quasi-Static Bending
Authors: Aamir Mubashar, Ibrahim Fiaz
Abstract:
This research presents a three-dimensional finite element modelling strategy to simulate damage in a quasi-static three-point bending analysis of woven twill 2/2 type carbon fibre reinforced plastic (CFRP) composite on a micro-meso level using the cohesive zone modelling technique. A meso-scale finite element model comprising a number of plies was developed in the commercial finite element code Abaqus/Explicit. The interfaces between the plies were explicitly modelled using cohesive zone elements to allow for debonding by crack initiation and propagation. The load-deflection response of the CFRP within the quasi-static range was obtained and compared with data existing in the literature. This provided validation of the model at the global scale. The outputs of the global model were then used to develop a simulation model capturing the micro-meso scale material features. The sub-model consisted of a refined-mesh representative volume element (RVE) modelled in the TexGen software, which was later embedded with cohesive elements in the finite element software environment. The results obtained from the developed strategy successfully predicted the overall load-deflection response and the damage in the global model and sub-model at the flexure limit of the specimen. A detailed analysis of the effects of the micro-scale features was carried out.
Keywords: woven composites, multi-scale modelling, cohesive zone, finite element model
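Cohesive zone elements are typically driven by a traction-separation law. As an illustrative sketch (a generic bilinear law with made-up parameter values, not the specific law or properties used in this study), the interface traction as a function of separation can be written as:

```python
def bilinear_traction(delta, delta0, delta_f, t_max):
    """Bilinear cohesive traction-separation law.

    Linear elastic up to separation delta0 (peak traction t_max), then linear
    softening until complete failure (zero traction) at separation delta_f.
    """
    if delta <= 0.0:
        return 0.0
    if delta <= delta0:
        return t_max * delta / delta0          # elastic loading branch
    if delta < delta_f:
        return t_max * (delta_f - delta) / (delta_f - delta0)  # damage/softening branch
    return 0.0                                 # fully debonded
```

The area under this curve is the fracture energy of the interface, which, together with t_max, is the quantity usually calibrated against experiments in cohesive zone modelling.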
Procedia PDF Downloads 139
3133 A Corpus-Based Study on the Lexical, Syntactic and Sequential Features across Interpreting Types
Authors: Qianxi Lv, Junying Liang
Abstract:
Among the various modes of interpreting, simultaneous interpreting (SI) is regarded as a ‘complex’ and ‘extreme condition’ among cognitive tasks, while consecutive interpreting (CI) does not require processing capacity to be shared between tasks. Given that SI exerts great cognitive demand, it makes sense to posit that the output of SI may be more compromised than that of CI in its linguistic features. The bulk of the research has stressed the varying cognitive demands and processes involved in different modes of interpreting; however, related empirical research is sparse. In keeping with our interest in the quantitative linguistic factors discriminating between SI and CI, the current study examines potential lexical simplification, syntactic complexity and sequential organization mechanisms with a self-built inter-modal corpus of transcribed simultaneous and consecutive interpretations, translated speech and original speech texts, with a total running word count of 321,960. The lexical features extracted are lexical density, list head coverage, hapax legomena, and type-token ratio, as well as core vocabulary percentage. Dependency distance, an index of syntactic complexity reflective of processing demand, is also employed. The frequency motif, a sequential unit not bound to grammar, is used to visualize the local function distribution of the interpreting output. While SI is generally regarded as multitasking with a high cognitive load, our findings show that CI may tax cognitive resources differently, and hence yields more lexically and syntactically simplified output. In addition, the sequential features manifest that SI and CI organize sequences from the source text into the output in different ways, each minimizing the cognitive load in its own manner. We interpret these results within a framework in which cognitive demand is exerted on both the maintenance and coordination components of working memory.
On the one hand, the information maintained in CI is inherently larger in volume than in SI. On the other hand, time constraints directly influence the sentence reformulation process. The temporal pressure from the input in SI allows interpreters to keep only a small chunk of information in the focus of attention. Thus, SI interpreters usually produce the output by largely retaining the source structure so as to release the information from working memory immediately after it is formulated in the target language. Conversely, CI interpreters receive at least a few sentences before reformulation and are more self-paced. CI interpreters may thus tend to retain and generate the information in a way that lessens the demand. In other words, interpreters cope with the high demand in the reformulation phase of CI by generating output with densely distributed function words, more content words of higher frequency values and fewer variations, simpler structures and more frequently used language sequences. We consequently propose a revised effort model based on these results for a better illustration of cognitive demand during both interpreting types.
Keywords: cognitive demand, corpus-based, dependency distance, frequency motif, interpreting types, lexical simplification, sequential units distribution, syntactic complexity
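Several of the corpus measures above have simple operational definitions. The sketch below illustrates three of them on tokenized text; the study's exact tokenization, function-word list and dependency annotation scheme are not given in the abstract, so these are generic textbook versions.

```python
def type_token_ratio(tokens):
    """Distinct word forms over total tokens; lower values indicate a more
    repetitive, lexically simplified output."""
    return len(set(tokens)) / len(tokens)

def lexical_density(tokens, function_words):
    """Share of content words; output under load tends to carry more function words."""
    content = [t for t in tokens if t not in function_words]
    return len(content) / len(tokens)

def mean_dependency_distance(arcs):
    """arcs: (head_position, dependent_position) pairs from a dependency parse;
    longer average distances imply heavier syntactic processing demand."""
    return sum(abs(h - d) for h, d in arcs) / len(arcs)
```

Comparing these scores between the SI and CI sub-corpora is exactly the kind of contrast the study draws, with lower TTR and shorter dependency distances read as signs of simplification.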
Procedia PDF Downloads 181
3132 Investigating Complement Clause Choice in Written Educated Nigerian English (ENE)
Authors: Juliet Udoudom
Abstract:
Inappropriate complement selection is one of the major features of non-standard complementation in the sentence constructions of Nigerian users of English. This paper investigates complement clause choice in written Educated Nigerian English (ENE) and offers some results. It aims at determining preferred and dispreferred patterns of complement clause selection for verb heads in English by selected Nigerian users of English. The complementation data analyzed in this investigation were obtained from experimental tasks designed to elicit complement categories of verb, noun, adjective and prepositional heads in English. Insights from Government-Binding theory were employed in analyzing the data, which comprised responses from one hundred subjects to a picture elicitation exercise, a grammaticality judgement test, and a free composition task. The findings indicate a general tendency for clausal complements (CPs) introduced by the complementizer that to be preferred by the subjects studied. Of the 235 tokens of clausal complements in our corpus, 128 (54.46%) were CPs headed by that, while whether- and if-clauses recorded 31.07% and 8.94%, respectively. The complement clause type with the lowest incidence of choice was the CP headed by the complementiser for, with a 5.53% incidence of occurrence. Further findings indicate that the semantic features of the relevant embedding verb heads were not taken into consideration in the choice of complementisers introducing the respective complement clauses; hence the that-clause was chosen to complement verbs like prefer. In addition, the dispreferred choice of the for-clause is explicable by the fact that the respondents regard for as a preposition, not a complementiser.
Keywords: complement, complement clause, complement selection, complementisers, government-binding
Procedia PDF Downloads 188
3131 Analysis of Kilistra (Gokyurt) Settlement within the Context of Traditional Residential Architecture
Authors: Esra Yaldız, Tugba Bulbul Bahtiyar, Dicle Aydın
Abstract:
Humans meet their need for shelter through housing, which they structure in line with their habits and necessities. In housing culture, the traditional dwelling plays an important role as a social and cultural transmitter. It provides concrete data by being planned in parallel with users' lifestyles and habits, having its own dynamics and components, and being designed in harmony with nature, the environment and its context. Traditional dwelling textures create a healthy and cozy living environment through adaptation to natural conditions, topography, climate and context; utilization of construction materials found nearby; usage of traditional techniques and forms; and the natural insulation of the construction materials used. One of the examples of traditional settlements in Anatolia is the Kilistra (Gökyurt) settlement in Konya province. Having been among the important centers of Christianity in the past, and possessing distinctive architecture, culture and natural features as well as geographical differences (climate, geological structure, materials), Kilistra can also be identified as a traditional settlement comprising family, religious and economic structures as well as cultural interaction. The foundation of this study is the traditional residential texture of Kilistra with its unique features. The objective of the study is to assess the conformity of the traditional residential texture of Kilistra with the present topography, climatic data, and geographical values within the context of human-scale construction, usage of green space, indigenous construction materials, construction form, building envelope, and spatial organization in housing.
Keywords: traditional residential architecture, Kilistra, Anatolia, Konya
Procedia PDF Downloads 413
3130 Automatic Content Curation of Visual Heritage
Authors: Delphine Ribes Lemay, Valentine Bernasconi, André Andrade, Lara Défayes, Mathieu Salzmann, Frédéric Kaplan, Nicolas Henchoz
Abstract:
Digitization and preservation of large heritage collections induce high maintenance costs to keep up with technical standards and ensure sustainable access. Creating impactful usage is instrumental to justify the resources for long-term preservation. The Museum für Gestaltung of Zurich holds one of the biggest poster collections in the world, of which 52,000 posters have been digitised. In the process of building a digital installation to valorize the collection, one objective was to develop an algorithm capable of predicting the next poster to show according to the ones already displayed. The work presented here describes the steps to build an algorithm able to automatically create sequences of posters reflecting the associations performed by curators and professional designers. The challenge has similarities with the domain of song playlist algorithms. Recently, artificial intelligence techniques, and more specifically deep-learning algorithms, have been used to facilitate their generation. Promising results were found with recurrent neural networks (RNNs) trained on manually generated playlists and paired with clusters of features extracted from songs. We used the same principles to create the proposed algorithm, but applied to a challenging medium: posters. First, a convolutional autoencoder was trained to extract features of the posters, with the 52,000 digital posters used as the training set. The poster features were then clustered. Next, an RNN learned to predict the next cluster according to the previous ones. The RNN training set was composed of poster sequences extracted from a collection of books from the Museum für Gestaltung of Zurich dedicated to displaying posters. Finally, within the predicted cluster, the poster closest to the previous poster is selected, using the mean square distance between poster features to compute the proximity.
To validate the predictive model, we compared sequences of 15 posters produced by our model to randomly and manually generated sequences. The manual sequences were created by a professional graphic designer. We asked 21 participants working as professional graphic designers to sort the sequences from the one with the strongest graphic line to the one with the weakest, and to motivate their answers with a short description. The sequences produced by the designer were ranked first 60%, second 25% and third 15% of the time. The sequences produced by our predictive model were ranked first 25%, second 45% and third 30% of the time. The sequences produced randomly were ranked first 15%, second 29%, and third 55% of the time. Compared to the designer sequences, and as reported by participants, the model and random sequences lacked thematic continuity. According to these results, the proposed model is able to generate better poster sequencing than random sampling, and our algorithm is sometimes able to outperform a professional designer. As a next step, the proposed algorithm should include the possibility to create sequences according to a selected theme. To conclude, this work shows the potential of artificial intelligence techniques to learn from existing content and to provide a tool to curate large sets of data, with a permanent renewal of the presented content.
Keywords: Artificial Intelligence, Digital Humanities, serendipity, design research
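The final selection step described above (pick, within the RNN-predicted cluster, the poster closest to the current one by mean square distance) can be sketched as follows; the data structures and names are our own illustration, not the authors' code.

```python
def mean_squared_distance(a, b):
    """Mean squared distance between two equal-length feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def next_poster(current_features, cluster_candidates):
    """Within the cluster predicted by the RNN, pick the poster whose autoencoder
    features are closest (by mean squared distance) to the poster on display.

    cluster_candidates maps a poster id to its feature vector."""
    return min(
        cluster_candidates,
        key=lambda pid: mean_squared_distance(current_features, cluster_candidates[pid]),
    )
```

The cluster-then-nearest design keeps the RNN's job coarse (sequence of clusters) while the fine-grained visual continuity comes from the feature-space proximity search.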
Procedia PDF Downloads 186
3129 Comparison of Different Machine Learning Algorithms for Solubility Prediction
Authors: Muhammet Baldan, Emel Timuçin
Abstract:
Molecular solubility prediction plays a crucial role in various fields, such as drug discovery, environmental science, and materials science. In this study, we compare the performance of five machine learning algorithms, namely linear regression, support vector machines (SVM), random forests, gradient boosting machines (GBM), and neural networks, for predicting molecular solubility using the AqSolDB dataset. The dataset consists of 9,981 data points with their corresponding solubility values. MACCS keys (166 bits), RDKit properties (20 properties), and structural properties (3) were extracted for every SMILES representation in the dataset, giving a total of 189 features per molecule for training and testing. Each algorithm was trained on a subset of the dataset and evaluated using accuracy scores. Additionally, the computational time for training and testing was recorded to assess the efficiency of each algorithm. Our results demonstrate that the random forest model outperformed the other algorithms in predictive accuracy, achieving a 0.93 accuracy score. Gradient boosting machines and neural networks also exhibit strong performance, closely followed by support vector machines. Linear regression, while simpler in nature, demonstrates competitive performance but with slightly higher errors than the ensemble methods. Overall, this study provides valuable insights into the performance of machine learning algorithms for molecular solubility prediction, highlighting the importance of algorithm selection in achieving accurate and efficient predictions in practical applications.
Keywords: random forest, machine learning, comparison, feature extraction
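To give a flavour of how bit-vector fingerprints such as MACCS keys drive similarity-based prediction, the sketch below implements Tanimoto similarity and a one-nearest-neighbour predictor. This is a deliberately simple stand-in for the ensemble models the study actually compares, using toy 4-bit fingerprints rather than real 166-bit MACCS keys.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto similarity between two binary fingerprints
    (e.g. 166-bit MACCS keys): |intersection| / |union| of the set bits."""
    on_a = {i for i, bit in enumerate(fp_a) if bit}
    on_b = {i for i, bit in enumerate(fp_b) if bit}
    union = on_a | on_b
    return len(on_a & on_b) / len(union) if union else 1.0

def predict_1nn(query_fp, training_set):
    """training_set: (fingerprint, solubility_label) pairs; return the label
    of the most structurally similar training molecule."""
    best_fp, best_label = max(training_set, key=lambda pair: tanimoto(query_fp, pair[0]))
    return best_label
```

A random forest over the full 189-feature vectors, as in the study, generalizes this idea by learning which feature splits carry the most solubility signal instead of weighting all bits equally.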
Procedia PDF Downloads 42
3128 Polycystic Ovary Syndrome: Cervical Cytology Features and Its Association with Endometrial Cancer
Authors: Faezah Shekh Abdullah, Mohd. Azizuddin Mohd. Yussof, Komathy Thiagarajan, Hasnoorina Husin, Noor Azreena Abd Aziz
Abstract:
Polycystic ovary syndrome (PCOS) has been associated with multiple disorders, including endocrine disorders, metabolic syndrome, infertility, and endometrial cancer. Women with PCOS are estimated to have a roughly threefold higher chance of developing endometrial cancer than women without PCOS. This study was therefore conducted to determine the association between polycystic ovary syndrome and endometrial cancer and to determine the cervical cytology features of PCOS. Patients attending the Subfertility Clinic of the National Population and Family Development Board were recruited and examined physically by medical practitioners. They were categorized into two groups: i) the PCOS group if they met the Rotterdam Criteria 2004, and ii) the control group if they did not. Cervical sampling was done on all patients via the liquid-based cytology (LBC) method pre- and post-subfertility treatment. A total of 167 patients participated in the study, of whom 79 belonged to the PCOS group and 88 to the control group. The findings showed no cervical or endometrial cancer cases in either group. The liquid-based cytology results in the PCOS group displayed more cases with cellular changes, i.e., benign inflammation, atrophic smear and Candida sp. infection. To conclude, no association was found between polycystic ovary syndrome and endometrial cancer. A more holistic study with a larger number of participants could further clarify the association between endometrial cancer and PCOS. Furthermore, a longer interval between the pre- and post-subfertility-treatment LBC should be applied to observe changes in the cervical cells.
Keywords: endometrial cancer, liquid-based cytology, PCOS, polycystic ovary syndrome
Procedia PDF Downloads 144
3127 On the Bias and Predictability of Asylum Cases
Authors: Panagiota Katsikouli, William Hamilton Byrne, Thomas Gammeltoft-Hansen, Tijs Slaats
Abstract:
An individual who demonstrates a well-founded fear of persecution or faces a real risk of being subjected to torture is eligible for asylum. In Danish law, the exact legal thresholds reflect those established by international conventions, notably the 1951 Refugee Convention and the 1950 European Convention on Human Rights. These international treaties, however, remain largely silent on how states should assess asylum claims. As a result, national authorities are typically left to determine an individual's legal eligibility on a narrow basis consisting of an oral testimony, which may itself be hampered by several factors, including imprecise language interpretation, insecurity, or applicants' lack of trust towards the authorities. The shaky ground on which authorities must base their subjective perceptions of asylum applicants' credibility raises the question of whether adjudicators make the correct decision in all cases. Moreover, the subjective element in these assessments raises the question of whether individual asylum cases could be afflicted by implicit bias or stereotyping among adjudicators. In fact, recent studies have uncovered significant correlations between decision outcomes and the experience and gender of the assigned judge, as well as correlations between asylum outcomes and entirely external events such as the weather and political elections. In this study, we analyze a publicly available dataset containing approximately 8,000 summaries of asylum cases initially rejected and then retried by the Refugee Appeals Board (RAB) in Denmark. First, we look for variations in the recognition rates with regard to a number of applicants' features: their country of origin/nationality, their identified gender, their identified religion, their ethnicity, whether torture was mentioned in their case and, if so, whether the claim was supported or not, and the year the applicant entered Denmark.
In order to extract these features, as well as the final decision of the RAB, from the text summaries, we applied natural language processing and regular expressions adjusted for the Danish language. We observed interesting variations in recognition rates related to the applicants' country of origin, ethnicity, year of entry, and the support or not of torture claims whenever such claims were made. The appearance (or not) of significant variations in the recognition rates does not necessarily imply (or rule out) bias in the decision-making process. None of the considered features, with the possible exception of the torture claims, should be a decisive factor in an asylum seeker's fate. We therefore investigate whether the decision can be predicted on the basis of these features and, consequently, whether biases are likely to exist in the decision-making process. We employed a number of machine learning classifiers and found that when using the applicant's country of origin, religion, ethnicity and year of entry with a random forest classifier or a decision tree, the prediction accuracy is as high as 82% and 85%, respectively. These features thus carry potentially predictive properties with regard to the outcome of an asylum case. Our analysis and findings call for further investigation of the predictability of the outcome on a larger dataset of 17,000 cases, which is ongoing.
Keywords: asylum adjudications, automated decision-making, machine learning, text mining
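The regular-expression feature extraction step can be sketched as below. The patterns here are hypothetical English stand-ins; the study used expressions adjusted for Danish-language case summaries, and the actual field names are our own.

```python
import re

# Hypothetical English patterns; the study's expressions target Danish text.
PATTERNS = {
    "nationality": re.compile(r"citizen of (\w+)", re.IGNORECASE),
    "torture": re.compile(r"\btorture\b", re.IGNORECASE),
    "entry_year": re.compile(r"entered Denmark in (\d{4})"),
}

def extract_features(summary):
    """Pull a small feature dict out of one free-text case summary."""
    feats = {}
    m = PATTERNS["nationality"].search(summary)
    feats["nationality"] = m.group(1) if m else None
    feats["torture_mentioned"] = bool(PATTERNS["torture"].search(summary))
    m = PATTERNS["entry_year"].search(summary)
    feats["entry_year"] = int(m.group(1)) if m else None
    return feats
```

Rows of such dictionaries, one per summary, form the tabular input on which classifiers like the random forest reported above are then trained.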
Procedia PDF Downloads 96
3126 Blood Flow Simulations to Understand the Role of the Distal Vascular Branches of Carotid Artery in the Stroke Prediction
Authors: Muhsin Kizhisseri, Jorg Schluter, Saleh Gharie
Abstract:
Atherosclerosis is the main cause of stroke, one of the deadliest diseases in the world. The carotid artery is a prominent location for atherosclerotic progression, which hinders blood flow into the brain. Including computational fluid dynamics (CFD) in the diagnostic cycle to understand the hemodynamics of the patient-specific carotid artery can give insights into stroke prediction. Realistic outlet boundary conditions are an inevitable part of such numerical simulations and are one of the major factors determining the accuracy of CFD results. Windkessel-model-based outlet boundary conditions can capture realistic characteristics of the distal vascular branches of the carotid artery, such as the resistance to blood flow and the compliance of the distal arterial walls. This study aims to find the most influential distal branches of the carotid artery by using the Windkessel model parameters in the outlet boundary conditions. A parametric study of the Windkessel model parameters can include the geometrical features of the distal branches, such as radius and length. Incorporating variations of the geometrical features of the major distal branches, such as the middle cerebral artery, anterior cerebral artery, and ophthalmic artery, through the Windkessel model can aid in identifying the most influential distal branch of the carotid artery. The results from this study can help physicians and stroke neurologists reach a more detailed and accurate judgment of the patient's condition.
Keywords: stroke, carotid artery, computational fluid dynamics, patient-specific, Windkessel model, distal vascular branches
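A common form of Windkessel outlet condition is the three-element model, which lumps a distal branch into a characteristic impedance Zc, a distal resistance Rd and a compliance C. The sketch below integrates it with forward Euler for an arbitrary inflow waveform; the parameter values in the test are illustrative, not physiological, and the abstract does not specify which Windkessel variant the study uses.

```python
def windkessel3_pressure(q_of_t, z_c, r_d, c, t_end, dt=1e-3):
    """Outlet pressure of a three-element Windkessel model at time t_end.

    Governing equations:  P(t) = Zc * Q(t) + Pc(t),
                          C * dPc/dt = Q(t) - Pc / Rd,
    integrated from Pc(0) = 0 by forward Euler with step dt.
    """
    pc = 0.0
    steps = int(round(t_end / dt))
    for k in range(steps):
        q = q_of_t(k * dt)
        pc += dt * (q - pc / r_d) / c
    return z_c * q_of_t(t_end) + pc
```

For a constant inflow Q0 the model relaxes with time constant Rd*C to the steady pressure Q0*(Zc + Rd), which is the sanity check used below; in a CFD run, q_of_t would be the computed outlet flow at each time step instead.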
Procedia PDF Downloads 216
3125 Biophysical Features of Glioma-Derived Extracellular Vesicles as Potential Diagnostic Markers
Authors: Abhimanyu Thakur, Youngjin Lee
Abstract:
Glioma is a lethal brain cancer whose early diagnosis and prognosis are limited by the dearth of suitable techniques for its early detection. Current approaches to the diagnosis of this disease, including magnetic resonance imaging (MRI), computed tomography (CT), and invasive biopsy, have several limitations, demanding an alternative method. Recently, extracellular vesicles (EVs) have been used in numerous biomarker studies, chiefly exosomes and microvesicles (MVs), which are released by most cells and found in biofluids including blood, cerebrospinal fluid (CSF), and urine. Remarkably, glioma cells (GMs) release a high number of EVs, which have been found to cross the blood-brain barrier (BBB) and to mirror the constituents of the parent GMs, including proteins and lncRNAs; however, the biophysical properties of EVs have not yet been explored as a biomarker for glioma. We isolated EVs from the cell culture conditioned medium of GMs and regular primary cultures, and from the blood and urine of wild-type (WT) and glioma mouse models, and characterized them by nanoparticle tracking analysis, transmission electron microscopy, immunogold electron microscopy, and dynamic light scattering. Next, we measured the biophysical parameters of GM-EVs using atomic force microscopy. Further, the functional constituents of the EVs were examined by FTIR and Raman spectroscopy. Exosomes and MVs derived from GMs, blood, and urine showed distinct biophysical parameters (roughness, adhesion force, and stiffness), differing from those of regular primary glial cells, WT blood, and WT urine, which can be attributed to their characteristic functional constituents. Therefore, biophysical features can serve as potential diagnostic biomarkers for glioma.
Keywords: glioma, extracellular vesicles, exosomes, microvesicles, biophysical properties
Procedia PDF Downloads 142
3124 Cladode features in Opuntia ficus-indica resistant cultivars to Dactylopius coccus Costa
Authors: Yemane Kahsay Berhe
Abstract:
The multipurpose cactus pear, a plant with great potential as a source of food and livestock feed, faces a threat from Dactylopius spp. in different countries. Specifically, D. coccus is an important pest damaging significant areas in Tigray, Ethiopia. Using pest-resistant cultivars is an important element of an integrated pest management strategy, and studying the mechanisms of resistance is vital. Resistance can be chemical or physical, involving oxalate crystals and other cladode characteristics. Cladode features of six cultivars (three O. ficus-indica, two O. cochenillifera, and one O. robusta) were examined for resistance to D. coccus in a completely randomized design (CRD) with three replications. ‘Rojo Pelón’ (O. ficus-indica), ‘Robusta’ (O. robusta), and ‘Bioplástico’ (O. cochenillifera) are resistant cultivars, while ‘Atlixco’ and ‘Chicomostoc’ (O. ficus-indica) and ‘Nopalea’ (O. cochenillifera) are susceptible. The cultivars showed significant differences in cladode weight (g) and in cladode length, width, and thickness (cm); cladode thickness was highest in ‘Rojo Pelón’ followed by ‘Robusta’. The number of calcium oxalate crystals per mm was higher in ‘Bioplástico’ (20.7±2.08), followed by ‘Robusta’ (18.9±2.31) and ‘Rojo Pelón’ (15.9±0.34); similarly, epidermis thickness was higher in ‘Bioplástico’ (0.21±0.032) and ‘Robusta’ (0.19±0.014), and comparable in ‘Rojo Pelón’ (0.18±0.026). Cuticle thickness, however, did not differ among cultivars. Cladode thickness, calcium oxalate number, and epidermis thickness were positively correlated with resistance. These results demonstrate that calcium oxalate number and epidermis thickness might contribute positively to D. coccus resistance in O. ficus-indica. This feeding-barrier role and the insect-plant interaction need further study.
Keywords: cactus pear, resistance, druses, epidermis thickness
Procedia PDF Downloads 75
3123 Low-Cost Parking Lot Mapping and Localization for Home Zone Parking Pilot
Authors: Hongbo Zhang, Xinlu Tang, Jiangwei Li, Chi Yan
Abstract:
Home zone parking pilot (HPP) is a fast-growing segment of low-speed autonomous driving applications. It requires the car to automatically cruise around a parking lot and park itself in a range of up to 100 meters inside a recurrent home/office parking lot, which calls for a precise parking lot mapping and localization solution. Although lidar is ideal for SLAM, car OEMs favor a low-cost, fish-eye camera based visual SLAM approach. Recent approaches have employed segmentation models to extract semantic features and improve mapping accuracy, but these AI models are memory-unfriendly and computationally expensive, making deployment on embedded ADAS systems difficult. To address this issue, we propose a new method that utilizes object detection models to extract robust and accurate parking lot features. The proposed method reduces computational cost while maintaining high accuracy. Once combined with the vehicle's wheel-pulse information, the system can construct maps and locate the vehicle in real time. This article discusses in detail (1) fish-eye based around view monitoring (AVM) with transparent chassis images as the inputs, (2) an object detection (OD) based feature point extraction algorithm to generate the point cloud, (3) a low-computation parking lot mapping algorithm, and (4) the real-time localization algorithm. Finally, we demonstrate experimental results with an embedded ADAS system installed on a real car in an underground parking lot.
Keywords: ADAS, home zone parking pilot, object detection, visual SLAM
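The wheel-pulse information mentioned above typically feeds a dead-reckoning odometry update between visual fixes. The sketch below shows a generic differential-drive version of that update; the parameter names and the midpoint-heading scheme are standard textbook choices, not details taken from this article.

```python
import math

def wheel_pulse_update(pose, pulses_left, pulses_right, meters_per_pulse, track_width):
    """Dead-reckoning pose update from wheel-encoder pulse counts.

    pose is (x, y, heading) with heading in radians; track_width is the
    distance between the left and right wheels in meters.
    """
    x, y, heading = pose
    d_left = pulses_left * meters_per_pulse
    d_right = pulses_right * meters_per_pulse
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / track_width
    # Evaluating the heading at the midpoint of the turn gives a
    # second-order accurate position update for small d_theta.
    x += d_center * math.cos(heading + d_theta / 2.0)
    y += d_center * math.sin(heading + d_theta / 2.0)
    return (x, y, heading + d_theta)
```

In a visual SLAM pipeline, this cheap odometry propagates the pose between frames while the object-detection landmarks correct the accumulated drift.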
Procedia PDF Downloads 67