Search results for: Object Identification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4065

2835 The Regional Center for Business Quality of the University Center of the Valleys: Transiting to an Entrepreneurial University

Authors: Carlos Alberto Santamaria Velasco

Abstract:

This chapter analyzes the case of the Centro Regional para la Calidad Empresarial (CReCE), starting from the theoretical discussion about universities as actors in development and enterprise creation, as well as the promotion of an entrepreneurial culture in their environment of influence, part of the linkage and extension actions that, alongside teaching and research, constitute their substantive functions. The objective is to review the theoretical discussion and the state of the art on entrepreneurial universities from the institutional theory of Douglass North, carrying out a theoretical analysis of the formal and informal factors in universities and linking them to the specific case of the CReCE. A literature review was carried out in the main entrepreneurship journals on the factors that influence the creation and development of entrepreneurial universities, complemented by the study of a particular case, the CReCE, and how it affects the transformation of CUVALLES (Centro Universitario de los Valles) on its way toward becoming an entrepreneurial university.

Keywords: entrepreneurial university, institutional theory, university, entrepreneurial universities

Procedia PDF Downloads 233
2834 Localising Gauss’s Law and the Electric Charge Induction on a Conducting Sphere

Authors: Sirapat Lookrak, Anol Paisal

Abstract:

Space debris takes numerous forms, including ferrous and non-ferrous metallic objects. An external electric field induces negative charges to separate from positive charges inside the debris. In this research, we focus only on conducting materials. The assumption, following Gauss's law, is that the surface charge density of a conductor is proportional to the electric field perpendicular to that surface. We seek the charge density induced by an external electric field perpendicular to a conducting spherical surface. Because the external field over the sphere is not uniform, the electric field is considered locally: the localized spherical surface is approximated by its tangent plane, the Gaussian surface is a very small cylinder, and every point on the spherical surface has its own cylinder. The electric field from a circular electrode is calculated in near-field and far-field approximations, with a view to explaining touchless maneuvering of space debris orbits, and the induced electric charge density is derived for both approximations.
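The two relations at the heart of the abstract, the on-axis field of a circular electrode in its near-field and far-field forms and Gauss's law at a conductor surface (induced density equal to the permittivity times the perpendicular field), can be sketched numerically. The disk radius and charge density below are illustrative placeholders, not the study's values:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m


def e_field_disk_axis(sigma, R, z):
    """On-axis E field of a uniformly charged disk of radius R (near-field form)."""
    return (sigma / (2 * EPS0)) * (1 - z / math.sqrt(z**2 + R**2))


def e_field_point(Q, z):
    """Far-field approximation: the disk treated as a point charge Q."""
    return Q / (4 * math.pi * EPS0 * z**2)


def induced_charge_density(E_perp):
    """Gauss's law at a conductor surface: sigma_induced = eps0 * E_perp."""
    return EPS0 * E_perp


sigma, R = 1e-6, 0.05        # C/m^2 and disk radius in m (illustrative values)
Q = sigma * math.pi * R**2   # total charge on the disk
E_near = e_field_disk_axis(sigma, R, 1.0)  # evaluated 1 m away, far from the disk
E_far = e_field_point(Q, 1.0)
```

At distances much larger than the electrode radius the two expressions agree, which is the consistency check the near-field/far-field comparison in the abstract relies on.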

Keywords: near-field approximation, far-field approximation, localized Gauss’s law, electric charge density

Procedia PDF Downloads 132
2833 Improved Wi-Fi Backscatter System for Multi-to-Multi Communication

Authors: Chang-Bin Ha, Yong-Jun Kim, Dong-Hyun Ha, Hyoung-Kyu Song

Abstract:

The conventional Wi-Fi backscatter system can only process one-to-one communication between the Wi-Fi reader and the Wi-Fi tag. To improve the throughput of the conventional system, this paper proposes a multi-to-multi communication system. In the proposed system, the interference caused by multi-to-multi communication is effectively cancelled by orthogonal multiple access based on the identification code of each tag. Although the procedure for multi-to-multi communication generates overhead, the overhead is insignificant relative to the entire communication procedure because it is processed by the Wi-Fi protocol. The numerical results confirm that the throughput of the proposed system increases nearly in proportion to the number of tags that simultaneously participate in communication.
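The orthogonal multiple access described here can be illustrated with Walsh-Hadamard identification codes: tags transmit simultaneously and the reader separates them by correlation. This is a generic sketch of code-division separation under ideal channel assumptions, not the authors' exact protocol:

```python
def hadamard(n):
    """n x n Hadamard matrix (n a power of two), entries +1/-1; its rows
    form a set of mutually orthogonal spreading codes."""
    if n == 1:
        return [[1]]
    h = hadamard(n // 2)
    return ([row + row for row in h] +
            [row + [-x for x in row] for row in h])


codes = hadamard(4)    # one row per tag identification code
bits = [1, -1, -1, 1]  # the data bit (+1/-1) each of the 4 tags sends

# Channel: the reader receives the superposition of all tags' spread signals.
rx = [sum(bits[t] * codes[t][i] for t in range(4)) for i in range(4)]

# Reader: correlating with each tag's code cancels the other tags' interference.
decoded = [1 if sum(rx[i] * codes[t][i] for i in range(4)) > 0 else -1
           for t in range(4)]
```

Because the codes are orthogonal, each correlation collapses to a scaled copy of one tag's bit, which is why the simultaneous transmissions do not corrupt one another.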

Keywords: backscatter, multi-to-multi communication, orthogonality, Wi-Fi

Procedia PDF Downloads 510
2832 The Influence of Neural Synchrony on Auditory Middle Latency and Late Latency Responses and Its Correlation with Audiological Profile in Individuals with Auditory Neuropathy

Authors: P. Renjitha, P. Hari Prakash

Abstract:

Auditory neuropathy spectrum disorder (ANSD) is an auditory disorder with normal cochlear outer hair cell function and disrupted auditory nerve function. It results in a unique clinical profile: absent auditory brainstem response (ABR), absent acoustic reflexes, and preserved otoacoustic emissions (OAE) and cochlear microphonics. The lesion site could be the cochlear inner hair cells, the synapse between the inner hair cells and type I auditory nerve fibers, and/or the auditory nerve itself. The literature on neural synchrony at higher auditory centers, however, is sporadic and less well understood. It would be interesting to see whether neural synchrony recovers at higher auditory centers, and whether the level at which the auditory system recovers enough synchrony to produce observable evoked response potentials (ERPs) can predict speech perception. In the current study, eight participants with ANSD and healthy controls underwent detailed audiological assessment, including ABR, auditory middle latency response (AMLR), and auditory late latency response (ALLR). AMLR was recorded for clicks, and ALLR was evoked using 500 Hz and 2 kHz tone bursts. Analysis revealed that the participants could be categorized into three groups. In Group I (2/8), ALLR was present only for the 2 kHz tone burst. In Group II (4/8), AMLR was absent and ALLR was seen for both stimuli. Group III (2/8) consisted of individuals with identifiable AMLR and ALLR for all stimuli. The highest speech identification score observed in the ANSD group was 30%, indicating poor speech perception. Overall, the results indicate that the site of neural synchrony recovery varies across individuals with ANSD: some show recovery at the thalamocortical level, while others show it only at the cortical level. Variation across stimuli within the ALLR itself could again be related to neural synchrony. Nevertheless, none of these patterns could plausibly explain the speech perception ability of the individuals. Hence, it is concluded that neural synchrony as measured by evoked potentials is not a good clinical predictor of speech perception.

Keywords: auditory late latency response, auditory middle latency response, auditory neuropathy spectrum disorder, correlation with speech identification score

Procedia PDF Downloads 149
2831 Location Management in Wireless Sensor Networks with Mobility

Authors: Amrita Anil Agashe, Sumant Tapas, Ajay Verma Yogesh Sonavane, Sourabh Yeravar

Abstract:

Due to advancements in MEMS technology, wireless sensor networks have gained a lot of importance today. Their wide range of applications includes environmental and habitat monitoring, object localization, target tracking, security surveillance, etc. Wireless sensor networks consist of tiny sensor devices called motes. The constrained computational power, battery power, storage capacity, and communication bandwidth of these tiny motes pose challenging problems in the design and deployment of such systems. In this paper, we propose a ubiquitous framework for a Real-Time Tracking, Sensing and Management System using IITH motes. We also explain the algorithm we have developed for location management in wireless sensor networks under mobility. The developed framework and algorithm can detect emergency events and safety threats and provide warning signals for handling the emergency.

Keywords: mobility management, motes, multihop, wireless sensor networks

Procedia PDF Downloads 418
2830 Internal Capital Market Efficiency Study Based on an Improved Cash Flow Sensitivity Coefficient: The Case of Tomorrow Group

Authors: Peng Lu, Liu Ting

Abstract:

Because of the difficulty of financing from the external capital market, the reorganization and merger of private enterprises have formed family groups that seek the help of the internal capital market to meet their capital needs. However, an inefficient internal capital market fails to deliver the benefits it should and can even hinder the development of enterprises. This paper takes the Tomorrow Group as the research object for a case analysis. After using an improved cash flow sensitivity coefficient to measure the efficiency of the internal capital market of the Tomorrow Group, inefficiency is found. The analysis then reveals that the reasons for this inefficiency include a pyramidal equity structure that favors control, the separation of cash flow rights from control rights, a concentration of equity that leads to poor checks and balances, the abandonment of real industries, and information asymmetry.

Keywords: Tomorrow Group, internal capital market, related-party transactions, Baotou Tomorrow Technology Co., Ltd.

Procedia PDF Downloads 136
2829 Optimizing the Probabilistic Neural Network Training Algorithm for Multi-Class Identification

Authors: Abdelhadi Lotfi, Abdelkader Benyettou

Abstract:

In this work, a training algorithm for probabilistic neural networks (PNN) is presented. The algorithm addresses one of the major drawbacks of PNN, which is the size of the hidden layer in the network. By using a cross-validation training algorithm, the number of hidden neurons is shrunk to a smaller number consisting of the most representative samples of the training set. This is done without affecting the overall architecture of the network. Performance of the network is compared against performance of standard PNN for different databases from the UCI database repository. Results show an important gain in network size and performance.
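A minimal sketch of a PNN (one Gaussian unit per stored pattern, summed per class as a Parzen density estimate) together with a leave-one-out pruning pass in the spirit of cross-validation. The pruning rule used here, keep a training pattern only when the rest of the network misclassifies it, is an illustrative stand-in for shrinking the hidden layer, not necessarily the authors' exact algorithm:

```python
import math


def pnn_classify(x, patterns, sigma=0.5):
    """Classify x with a PNN: sum Gaussian kernel responses per class
    over the stored (class, pattern) pairs and return the arg-max class."""
    scores = {}
    for cls, p in patterns:
        d2 = sum((a - b) ** 2 for a, b in zip(x, p))
        scores[cls] = scores.get(cls, 0.0) + math.exp(-d2 / (2 * sigma**2))
    return max(scores, key=scores.get)


def prune_by_cross_validation(patterns, sigma=0.5):
    """Leave-one-out pruning: retain a pattern only if its removal makes the
    remaining network misclassify it (i.e., it is genuinely representative)."""
    kept = []
    for i, (cls, p) in enumerate(patterns):
        rest = patterns[:i] + patterns[i + 1:]
        if pnn_classify(p, rest, sigma) != cls:
            kept.append((cls, p))
    return kept or patterns  # never return an empty hidden layer


patterns = [("a", (0.0, 0.0)), ("a", (0.1, 0.0)),
            ("b", (3.0, 3.0)), ("b", (3.1, 3.0))]
pruned = prune_by_cross_validation(patterns)
```

The hidden layer size equals the number of stored patterns, so any pattern the pruning pass discards shrinks the network without changing its overall architecture, which is the trade-off the abstract describes.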

Keywords: classification, probabilistic neural networks, network optimization, pattern recognition

Procedia PDF Downloads 262
2828 Scientific Theoretical Fundamentals of Comparative Analysis

Authors: Khalliyeva Gulnoz Iskandarovna, Mannonova Feruzabonu Sherali Qizi

Abstract:

A scientific field called comparative literature, or literary comparative studies, compares two or more literary phenomena. At a time when global social, cultural, and literary relations are growing daily, comparative literature is one of the most important scientific fields. Any comparative investigation reveals the shared and unique characteristics of literary phenomena, which provide the cornerstone for overarching theoretical principles that apply to all literature. Comparative analysis operates on objects and their constituent components and, beyond the actions mentioned above, also compares the components of the objects of analysis with each other. The purpose of this article is to investigate comparative analysis in literature and to identify similarities and differences between comparable objects. In studying this topic, students, teachers, and researchers should become able to describe comparative research techniques and their fundamental ideas, and should acquire a basic understanding of comparative literature and its scope.

Keywords: object, natural, social, spiritual, epistemological, logical, methodological, axiological tasks, stages of comparison, environment, internal features, and typical situations

Procedia PDF Downloads 59
2827 Model Predictive Controller for Pasteurization Process

Authors: Tesfaye Alamirew Dessie

Abstract:

Our study focuses on developing a Model Predictive Controller (MPC) and evaluating it against a traditional PID controller for a pasteurization process. The dynamics of the pasteurization process were identified from experimental data using system identification. The quality of several model structures was evaluated using best fit against validation data, residual analysis, and stability analysis. The auto-regressive with exogenous input (ARX322) model fit the validation data for the pasteurization process with roughly 80.37 percent accuracy. The ARX322 model structure was then used to design the MPC and PID control strategies. Comparing controller performance in terms of settling time, overshoot percentage, and stability analysis shows that the MPC controller outperforms PID on those parameters.
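An ARX322 structure has three autoregressive terms (na = 3), two input terms (nb = 2), and an input delay of two samples (nk = 2). The sketch below simulates such a model for a step input; the coefficients are illustrative stable values, not the identified pasteurization plant:

```python
# ARX(3,2,2): y[k] = -a1*y[k-1] - a2*y[k-2] - a3*y[k-3] + b1*u[k-2] + b2*u[k-3]
def simulate_arx322(u, a, b):
    """Simulate an ARX model with orders na=3, nb=2 and delay nk=2."""
    y = [0.0] * len(u)
    for k in range(len(u)):
        ar = sum(-a[i] * y[k - 1 - i] for i in range(3) if k - 1 - i >= 0)
        ex = sum(b[j] * u[k - 2 - j] for j in range(2) if k - 2 - j >= 0)
        y[k] = ar + ex
    return y


# Hypothetical coefficients with poles at 0.5 and 0.7 (stable), DC gain 0.6.
a = [-1.2, 0.35, 0.0]
b = [0.05, 0.04]
step = [1.0] * 50
y = simulate_arx322(step, a, b)
```

The two-sample delay means the output stays at zero for the first two steps, and the step response settles at the model's DC gain, sum(b) / (1 + sum(a)); both properties are checked below.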

Keywords: MPC, PID, ARX, pasteurization

Procedia PDF Downloads 163
2826 Multi-Scale Geographic Object-Based Image Analysis (GEOBIA) Approach to Segment Very High Resolution Images for the Extraction of New Degraded Zones: Application to the Region of Mécheria in the South-West of Algeria

Authors: Bensaid A., Mostephaoui T., Nedjai R.

Abstract:

A considerable area of Algerian land is threatened by wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of the increasingly irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has become particularly accentuated. The extent of degradation in the arid region of the Algerian Mécheria department has generated a new situation characterized by reduced vegetation cover, decreased land productivity, and sand encroachment on urban development zones. In this study, we first investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cordons, based on the numerical processing of PlanetScope PSB.SB sensor images acquired on September 29, 2021. As a second step, we explore the use of a multi-scale geographic object-based image analysis (GEOBIA) approach to segment high-spatial-resolution images acquired over heterogeneous surfaces that vary according to human influence on the environment. We used the fractal net evolution approach (FNEA) algorithm to segment the images (Baatz & Schäpe, 2000). Multispectral data, a digital terrain model layer, ground truth data, a normalized difference vegetation index (NDVI) layer, and a first-order texture (entropy) layer were used to segment the multispectral images at three segmentation scales, with an emphasis on accurately delineating the boundaries and components of the sand accumulation areas (dunes, dune fields, nebkas, and barchans). It is worth noting that each auxiliary dataset contributed to improving the segmentation at different scales. The silted areas were then classified over the Naâma area using a nearest-neighbor approach applied to the imagery.
The classification of silted areas was successfully achieved over all study areas with an accuracy greater than 85%, although the results suggest that, overall, a higher degree of landscape heterogeneity may have a negative effect on segmentation and classification. Some areas suffered from the greatest over-segmentation and the lowest mapping accuracy (Kappa: 0.79), which was partially attributed to the confusion of a greater proportion of mixed siltation classes with both sandy areas and bare ground patches. This research has demonstrated a technique based on very high resolution images for mapping silted and degraded areas using GEOBIA, which can be applied to the study of other lands in the steppe areas of the northern countries of the African continent.
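One of the auxiliary layers used in the segmentation, the NDVI, is a simple per-pixel ratio of near-infrared and red reflectance. The reflectance values below are illustrative, not taken from the PlanetScope scenes:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel:
    (NIR - Red) / (NIR + Red), with a guard for a zero denominator."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0


# Sparse steppe vegetation vs. bare sand (illustrative reflectances).
veg_pixel = ndvi(nir=0.45, red=0.08)   # vegetated surface -> high NDVI
sand_pixel = ndvi(nir=0.30, red=0.25)  # bare/sandy surface -> low NDVI
```

Vegetated surfaces reflect strongly in the near-infrared and absorb red light, so a low NDVI is a useful cue for separating silted and bare-ground objects from vegetation during segmentation.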

Keywords: land development, GIS, sand dunes, segmentation, remote sensing

Procedia PDF Downloads 109
2825 Application of Advanced Remote Sensing Data in Mineral Exploration in the Vicinity of Heavy Dense Forest Cover Area of Jharkhand and Odisha State Mining Area

Authors: Hemant Kumar, R. N. K. Sharma, A. P. Krishna

Abstract:

The study was carried out on Saranda in Jharkhand and a part of Odisha state. Geospatial data from Hyperion, a hyperspectral remote sensing satellite, have been used. The study applied a wide variety of image processing techniques to enhance and extract the mining classes of Fe and Mn ores. Landsat-8 OLI sensor data have also been used to correctly explore the related minerals. Various processes were applied to enhance the mineralogy classes, and a comparative evaluation of the related spectral responses was performed. The Hyperion hyperspectral dataset proved to be an effective tool for extracting mineral and rock information within the shortwave infrared band range used. The abundant spatial and spectral information contained in hyperspectral images enables the differentiation of surface materials for targeted exploration applications such as mineral detection and mining.

Keywords: Hyperion, hyperspectral, sensor, Landsat-8

Procedia PDF Downloads 124
2824 Convolutional Neural Network Based on Random Kernels for Analyzing Visual Imagery

Authors: Ja-Keoung Koo, Kensuke Nakamura, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Byung-Woo Hong

Abstract:

Machine learning techniques based on convolutional neural networks (CNNs) have been actively developed and successfully applied to a variety of image analysis tasks, including reconstruction, noise reduction, resolution enhancement, segmentation, motion estimation, and object recognition. Classical visual information processing, ranging from low-level to high-level tasks, has been widely reformulated in the deep learning framework. Deriving visual interpretation from high-dimensional imagery data is generally considered a challenging problem. A CNN is a class of feed-forward artificial neural network that usually consists of deep layers whose connections are established by a series of non-linear operations. The CNN architecture is known to be shift invariant due to its shared weights and translation invariance characteristics. However, it is often computationally intractable to optimize a network with a large number of convolution layers, due to the large number of unknowns that must be optimized with respect to a training set that generally needs to be large enough for the model to generalize effectively. It is also necessary to limit the size of the convolution kernels for computational reasons, despite recent advances in parallel processing hardware, which leads to the use of constantly small convolution kernels throughout a deep CNN architecture. Yet it is often desirable to consider different scales when analyzing visual features at different layers of the network. We therefore propose a CNN model in which convolution kernels of different sizes are applied at each layer based on random projection. We apply random filters of varying sizes and associate the filter responses with scalar weights that correspond to the standard deviations of the random filters. This allows a large number of random filters to be used at the cost of one scalar unknown per filter. The computational cost of the back-propagation procedure does not increase with larger filter sizes, although additional cost is incurred when computing the convolutions in the feed-forward procedure. The use of random kernels of varying sizes allows image features to be analyzed effectively at multiple scales, leading to better generalization. The robustness and effectiveness of the proposed CNN based on random kernels are demonstrated by numerical experiments in which well-known CNN architectures are quantitatively compared with our models, which simply replace the convolution kernels with random filters. The experimental results indicate that our model achieves better performance with fewer unknown weights. The proposed algorithm has high potential for application to a variety of visual tasks in the CNN framework. Acknowledgement: This work was supported by the MSIT (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP, and by NRF-2014R1A2A1A11051941 and NRF-2017R1A2B4006023.
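The core idea, fixed random filters of varying sizes with a single trainable scalar per filter, can be sketched in one dimension. The per-filter scale below (the inverse filter norm, playing the role of the standard-deviation-linked weight) and the filter sizes are illustrative choices, not the paper's exact parameterization:

```python
import random

random.seed(0)


def conv1d_valid(signal, kernel):
    """1-D valid-mode convolution (cross-correlation; for random kernels
    the flip is immaterial)."""
    n = len(signal) - len(kernel) + 1
    return [sum(signal[i + j] * kernel[j] for j in range(len(kernel)))
            for i in range(n)]


# Fixed random kernels of *different* sizes; only the one scalar weight per
# kernel would be trainable, so a large kernel costs a single extra unknown.
kernel_sizes = [3, 5, 9]
kernels = [[random.gauss(0.0, 1.0 / k) for _ in range(k)]
           for k in kernel_sizes]
weights = [1.0 / (sum(v * v for v in ker) ** 0.5) for ker in kernels]

signal = [0.0] * 8 + [1.0] * 8  # a step edge to respond to
responses = [[w * r for r in conv1d_valid(signal, ker)]
             for w, ker in zip(weights, kernels)]
```

Each kernel size probes the signal at a different scale, while the number of trainable parameters stays at one scalar per filter regardless of how wide the filters are, which is the cost argument made in the abstract.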

Keywords: deep learning, convolutional neural network, random kernel, random projection, dimensionality reduction, object recognition

Procedia PDF Downloads 290
2823 Integration of Fuzzy Logic in the Representation of Knowledge: Application in the Building Domain

Authors: Hafida Bouarfa, Mohamed Abed

Abstract:

The main object of our work is the development and validation of a system called Fuzzy Vulnerability. Fuzzy Vulnerability uses a fuzzy representation in order to tolerate imprecision in the description of constructions. In the second phase, we evaluate the similarity between the vulnerability of a new construction and those of the whole set of historical cases. This similarity is evaluated on two levels: 1) individual similarity, based on fuzzy aggregation techniques; 2) global similarity, which uses regular increasing monotone (RIM) linguistic quantifiers to combine the various individual similarities between two constructions. The third phase of the Fuzzy Vulnerability process uses the vulnerabilities of historical constructions closely similar to the current construction to deduce its estimated vulnerability. We validated our system on 50 cases. We evaluated the performance of Fuzzy Vulnerability on two basic criteria: the precision of the estimates and the tolerance of imprecision throughout the estimation process. The comparison was made against estimates produced by laborious and time-consuming models. The results are satisfactory.
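RIM quantifiers are commonly used to derive ordered weighted averaging (OWA) weights, which is one standard way to combine individual similarities into a global one. The quantifier family Q(r) = r^alpha below is an illustrative choice, not necessarily the quantifier used by the authors:

```python
def rim_weights(n, alpha=2.0):
    """OWA weights from the RIM quantifier Q(r) = r**alpha:
    w_i = Q(i/n) - Q((i-1)/n); the weights always sum to 1."""
    q = lambda r: r ** alpha
    return [q(i / n) - q((i - 1) / n) for i in range(1, n + 1)]


def owa(similarities, alpha=2.0):
    """Combine individual similarities into a global one. With alpha > 1 the
    weight mass shifts toward the *lowest* similarities, approximating a
    'most criteria must match' semantics."""
    ordered = sorted(similarities, reverse=True)
    w = rim_weights(len(ordered), alpha)
    return sum(wi * si for wi, si in zip(w, ordered))
```

For example, similarities of 0.9, 0.8, and 0.4 aggregate to a global value below their plain mean, because the quantifier penalizes the one poor match.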

Keywords: case based reasoning, fuzzy logic, fuzzy case based reasoning, seismic vulnerability

Procedia PDF Downloads 292
2822 Object Oriented Software Engineering Approach to Industrial Information System Design and Implementation

Authors: Issa Hussein Manita

Abstract:

This paper presents an example of industrial information system design and implementation (IIDC), covering the software engineering design steps most commonly applied at the different design stages. We go through the life cycle of software system development, starting with a study of system requirements and ending with testing and delivery of the system, passing through system design and coding, program integration, and system integration. The most modern software design tools available are used, including, but not limited to, the Unified Modeling Language (UML), system modeling, SQL server-side applications, and use case analysis, design, and testing as applied to information processing systems. The system is designed to perform tasks specified by the client with real data. At the end of the implementation, a default or user-defined acceptance policy provides an overall score as an indication of system performance. To test the reliability of the designed system, it is tested in different environments and under different workloads, such as a multi-user environment.

Keywords: software engineering, design, system requirement, integration, unified modeling language

Procedia PDF Downloads 570
2821 The First Transcriptome Assembly of Marama Bean: An African Orphan Crop

Authors: Ethel E. Phiri, Lionel Hartzenberg, Percy Chimwamuromba, Emmanuel Nepolo, Jens Kossmann, James R. Lloyd

Abstract:

Orphan crops are under-researched and underutilized food plant species that have not been categorized as major food crops but have the potential to be economically and agronomically significant. They have been documented to tolerate extreme environmental conditions; however, limited research has been conducted to uncover their potential as food crop species. The New Partnership for Africa's Development (NEPAD) has classified the Marama bean, Tylosema esculentum, as an orphan crop. The plant is one of the 101 African orphan crops whose genomes are to be sequenced, assembled, and annotated in the foreseeable future. The Marama bean is a perennial leguminous plant that primarily grows in poor, arid soils in southern Africa. The plants produce large tubers that can weigh as much as 200 kg. While the foliage provides fodder, the tuber is carbohydrate-rich and is a staple food source for rural communities in Namibia. The edible seeds are also protein- and oil-rich. Marama bean plants respond rapidly to increased temperatures and severe water scarcity without extreme consequences. Advances in molecular biology and biotechnology have made it possible to effectively transfer technologies from model and major crops to orphan crops. In this research, the aim was to produce the first transcriptome assembly of the Marama bean from RNA-sequence data. Many model plant species have had their genomes sequenced and their transcriptomes assembled; the availability of transcriptome data for a non-model crop plant species will therefore allow for gene identification and comparisons between various species. The data have been sequenced using the Illumina HiSeq 2500 sequencing platform, and data analysis is underway. In essence, this research will eventually evaluate the potential use of the Marama bean as a crop species and improve its value in agronomy.

Keywords: 101 African orphan crops, RNA-Seq, Tylosema esculentum, underutilised crop plants

Procedia PDF Downloads 360
2820 ViraPart: A Text Refinement Framework for Automatic Speech Recognition and Natural Language Processing Tasks in Persian

Authors: Narges Farokhshad, Milad Molazadeh, Saman Jamalabbasi, Hamed Babaei Giglou, Saeed Bibak

Abstract:

The Persian language is an inflectional subject-object-verb language, a fact that makes Persian more ambiguous. However, techniques such as Zero-Width Non-Joiner (ZWNJ) recognition, punctuation restoration, and Persian Ezafe construction lead to a more understandable and precise language. In most Persian NLP work, these techniques are addressed individually; we believe, however, that for text refinement in Persian, all of these tasks are necessary. In this work, we propose the ViraPart framework, which uses an embedded ParsBERT at its core for text clarification. We first use the BERT variant for Persian, followed by a classifier layer for the classification procedures, and then combine the models' outputs to produce the refined text. In the end, the proposed models for ZWNJ recognition, punctuation restoration, and Persian Ezafe construction achieve averaged macro F1 scores of 96.90%, 92.13%, and 98.50%, respectively. The experimental results show that our proposed approach is very effective for text refinement in the Persian language.

Keywords: Persian Ezafe, punctuation, ZWNJ, NLP, ParsBERT, transformers

Procedia PDF Downloads 216
2819 Hybridization as a Process of Refusal of Imposed Popular Architecture

Authors: Jorge Eliseo Muñiz-Gutierrez, Daniel Olvera-García, Cristina Sotelo-Salas

Abstract:

The objective of this research is to further the understanding of the process of cultural hybridization as expressed through architecture that is mass-produced for consumption, taking as a case study the mass-built housing of the city of Mexicali, Mexico. The methodology arises from a hermeneutical study of the meta-modified architectural object, which guided a qualitative investigation carried out in two stages: the first is based on a literature review of cultural hybridization, and the second is an ethnographic study of the cultural exploration of the contextual landscape produced by the houses located in popular neighborhoods of Mexicali, Mexico. The research shows that there is an unconscious hybridization process, born of a mixture of impositions guided by the popular and the personal aspirations of the inhabitant. The study presents the possibilities of a home, its relationship with its inhabitant, and, in turn, its effects on the context and its contribution to culture through hybridization.

Keywords: hybridization, architectural landscape, architecture, mass housing

Procedia PDF Downloads 169
2818 Exploiting the Potential of Fabric Phase Sorptive Extraction for Forensic Food Safety: Analysis of Food Samples in Cases of Drug Facilitated Crimes

Authors: Bharti Jain, Rajeev Jain, Abuzar Kabir, Torki Zughaibi, Shweta Sharma

Abstract:

Drug-facilitated crimes (DFCs) entail the use of a single drug or a mixture of drugs to incapacitate a victim. Traditionally, biological samples gathered from victims are analyzed to establish evidence of drug administration. Nevertheless, the rapid metabolism of various drugs and delays in analysis can impede the identification of such substances. To address this, the present article describes a rapid, sustainable, highly efficient, and miniaturized protocol for the identification and quantification of three sedative-hypnotic drugs, namely diazepam (DZ), chlordiazepoxide (CDP), and ketamine (KET), in alcoholic beverages and complex food samples (cream biscuits, flavored milk, juice, cake, tea, sweets, and chocolate). The methodology uses fabric phase sorptive extraction (FPSE) to extract the three drugs, and the extracts are then analyzed by gas chromatography-mass spectrometry (GC-MS). Several parameters, including the type of membrane, pH, agitation time and speed, ionic strength, sample volume, elution volume and time, and type of elution solvent, were screened and thoroughly optimized. Among all evaluated membranes, sol-gel Carbowax 20M (CW-20M) demonstrated the most effective extraction of the target analytes. Under optimal conditions, the method was linear within the range 0.3–10 µg mL⁻¹ (or µg g⁻¹), with coefficients of determination (R²) ranging from 0.996 to 0.999. The limits of detection (LODs) and limits of quantification (LOQs) for liquid samples ranged between 0.020–0.069 µg mL⁻¹ and 0.066–0.22 µg mL⁻¹, respectively. Correspondingly, the LODs for solid samples ranged from 0.056 to 0.090 µg g⁻¹, and the LOQs from 0.18 to 0.29 µg g⁻¹. Notably, the method showed good precision, with repeatability and reproducibility below 5% and 10%, respectively. Furthermore, the FPSE-GC-MS method proved effective in determining diazepam in forensic food samples connected to drug-facilitated crimes. The proposed method was also evaluated for its whiteness using the RGB12 algorithm.
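LOD and LOQ figures of this kind are conventionally derived from the calibration slope and the standard deviation of the blank or low-level response (the ICH-style 3.3σ/S and 10σ/S formulas). The slope and blank deviation below are hypothetical values for illustration, not the study's calibration data:

```python
def lod_loq(slope, sd_blank):
    """ICH-style detection limits from a calibration line:
    LOD = 3.3 * sigma / slope, LOQ = 10 * sigma / slope."""
    return 3.3 * sd_blank / slope, 10 * sd_blank / slope


slope = 1.5e4    # detector response (peak area) per ug/mL, hypothetical
sd_blank = 90.0  # sd of the blank/low-level response, hypothetical
lod, loq = lod_loq(slope, sd_blank)
```

With these placeholder values the formulas give an LOD of about 0.02 µg mL⁻¹, the same order as the liquid-sample limits reported above; a steeper calibration slope or a quieter blank lowers both limits proportionally.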

Keywords: drug facilitated crime, fabric phase sorptive extraction, food forensics, white analytical chemistry

Procedia PDF Downloads 70
2817 Analog Voltage Inverter Drive for Capacitive Load with Adaptive Gain Control

Authors: Sun-Ki Hong, Yong-Ho Cho, Ki-Seok Kim, Tae-Sam Kang

Abstract:

A piezoelectric actuator is treated as an RC load when modeled electrically. Some piezoelectric actuator applications require arbitrary driving voltages. Especially for unidirectional arbitrary voltage driving, such as a sine wave, a special inverter with a circuit that can charge and discharge the capacitive energy can be used. In this case, the difference between the power supply level and the target voltage level for the RC load varies. Because the control gain is constant, the controlled output is not uniform across this varying voltage difference. In this paper, for a charge-discharge circuit providing unidirectional arbitrary voltage driving of a piezoelectric actuator, the controller gain is adjusted according to the voltage difference. With this simple idea, the load voltage can be controlled smoothly even though the voltage difference varies. The appropriateness of the approach is demonstrated by simulation of the proposed circuit.
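One plausible reading of the gain-adaptation idea is to scale the controller gain inversely with the headroom between the supply and the load voltage, so the effective loop gain seen by the RC load stays roughly constant as the operating point moves. The formula and the voltage values below are hypothetical, intended only to illustrate that reading, not the paper's circuit:

```python
def adaptive_gain(v_supply, v_load, k_base=1.0, eps=1e-6):
    """Gain inversely proportional to the supply-to-load voltage difference;
    eps guards against division by zero as the load nears the rail."""
    headroom = max(abs(v_supply - v_load), eps)
    return k_base / headroom


# As the load voltage approaches the supply rail, the gain rises to compensate.
g_low = adaptive_gain(24.0, 4.0)    # large headroom -> small gain
g_high = adaptive_gain(24.0, 20.0)  # small headroom -> larger gain
```

A fixed gain would make the drive sluggish near the rail and aggressive far from it; the inverse scaling evens out the controlled output over the whole waveform.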

Keywords: analog voltage inverter, capacitive load, gain control, dc-dc converter, piezoelectric, voltage waveform

Procedia PDF Downloads 655
2816 Quantitative Polymerase Chain Reaction Analysis of Phytoplankton Composition and Abundance to Assess Eutrophication: A Multi-Year Study in Twelve Large Rivers across the United States

Authors: Chiqian Zhang, Kyle D. McIntosh, Nathan Sienkiewicz, Ian Struewing, Erin A. Stelzer, Jennifer L. Graham, Jingrang Lu

Abstract:

Phytoplankton plays an essential role in freshwater aquatic ecosystems and is the primary group synthesizing organic carbon and providing food sources or energy to ecosystems. Therefore, the identification and quantification of phytoplankton are important for estimating and assessing ecosystem productivity (carbon fixation), water quality, and eutrophication. Microscopy is the current gold standard for identifying and quantifying phytoplankton composition and abundance. However, microscopic analysis of phytoplankton is time-consuming, has a low sample throughput, and requires deep knowledge and rich experience in microbial morphology to implement. To improve this situation, quantitative polymerase chain reaction (qPCR) was considered for phytoplankton identification and quantification. Using qPCR to assess phytoplankton composition and abundance, however, has not been comprehensively evaluated. This study focused on: 1) conducting a comprehensive performance comparison of qPCR and microscopy techniques in identifying and quantifying phytoplankton and 2) examining the use of qPCR as a tool for assessing eutrophication. Twelve large rivers located throughout the United States were evaluated using data collected from 2017 to 2019 to understand the relationship between qPCR-based phytoplankton abundance and eutrophication. This study revealed that temporal variation of phytoplankton abundance in the twelve rivers was limited within years (from late spring to late fall) and among years (2017, 2018, and 2019). Midcontinent rivers had moderately greater phytoplankton abundance than eastern and western rivers, presumably because midcontinent rivers were more eutrophic. The study also showed that qPCR- and microscope-determined phytoplankton abundance had a significant positive linear correlation (adjusted R² = 0.772, p < 0.001).
In addition, phytoplankton abundance assessed via qPCR showed promise as an indicator of the eutrophication status of those rivers, with oligotrophic rivers having low phytoplankton abundance and eutrophic rivers having (relatively) high phytoplankton abundance. This study demonstrated that qPCR could serve as an alternative tool to traditional microscopy for phytoplankton quantification and eutrophication assessment in freshwater rivers.
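For readers reproducing the correlation analysis, the adjusted R² reported above is computed from paired abundance measurements by a simple least-squares fit. The sketch below uses hypothetical qPCR/microscopy pairs, not the study's data:

```python
def adjusted_r2(x, y, k=1):
    """Adjusted R^2 of a simple least-squares fit of y on x (k predictors)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                 # slope
    a = my - b * mx               # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1 - ss_res / ss_tot
    # penalize for the number of predictors relative to sample size
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# hypothetical paired log10 abundances (qPCR vs. microscopy), not study data
qpcr  = [4.1, 4.8, 5.2, 5.9, 6.4, 7.0]
micro = [4.0, 4.9, 5.1, 6.1, 6.3, 7.2]
r2_adj = adjusted_r2(qpcr, micro)
```

The adjustment matters here because with only a handful of paired samples per river, an unadjusted R² would overstate the strength of the agreement.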

Keywords: phytoplankton, eutrophication, river, qPCR, microscopy, spatiotemporal variation

Procedia PDF Downloads 101
2815 Small Town Big Urban Issues the Case of Kiryat Ono, Israel

Authors: Ruth Shapira

Abstract:

Introduction: The rapid urbanization of the last century confronts planners, regulatory bodies, developers and, most of all, the public with seemingly unsolved conflicts regarding values, capital, and the wellbeing of the built and un-built urban space. This is reflected in the quality of urban form and life, which has seen no significant progress in the last two to three decades despite the ever-growing urban population. The objective of this paper is to analyze some of these fundamental issues through the case study of a relatively small town in the center of Israel (Kiryat Ono, 100,000 inhabitants), unfold the deep structure of qualities versus disruptors, present some cures that we have developed to bridge over, and humbly suggest a practice that may be generic for similar cases. Basic Methodologies: The OBJECT, the town of Kiryat Ono, is experimented upon in a series of four action processes: De-composition, Re-composition, the Centering process and, finally, Controlled Structural Disintegration. Each stage is based on facts, analysis of previous multidisciplinary interventions on various layers, and the inevitable reaction of the OBJECT, leading to conclusions based on innovative theoretical and practical methods that we have developed and that we believe are proper for the open-ended network, setting the rules for the contemporary urban society to cluster by. The Study: Kiryat Ono was founded 70 years ago as an agricultural settlement and rapidly turned into an urban entity. In spite of the massive intensification, the original DNA of the old small town remained deeply embedded, mostly in the quality of the public space and in the sense of clustered communities. In the past 20 years, the growing demand for housing has been addressed at the national level with recent master plans and urban regeneration policies, mostly encouraging individual economic initiatives.
Unfortunately, due to the obsolete existing planning platform, the present urban renewal is characterized by developer pressure, a dramatic change in building scale, and widespread disintegration of the existing urban and social tissue. Our office was commissioned to conceptualize two master plans for the two contradictory processes of Kiryat Ono's future: intensification and conservation. Following a comprehensive investigation into the deep structures and qualities of the existing town, we developed a new vocabulary of conservation terms, thus redefining the sense of PLACE. The main challenge was to create master plans that offer a regulatory basis for the accelerated and sporadic development, providing for the public good and preserving the characteristics of the PLACE, in the form of a toolbox of design guidelines with the ability to reorganize space along the time axis in a coherent way. In Conclusion: The system of rules that we have developed can generate endless possible patterns, making sure that at each implementation fragment an event is created and a better place is revealed. It takes time and perseverance, but it seems to be the way to provide a healthy framework for the accelerated urbanization of our chaotic present.

Keywords: housing, architecture, urban qualities, urban regeneration, conservation, intensification

Procedia PDF Downloads 361
2814 3D Point Cloud Model Color Adjustment by Combining Terrestrial Laser Scanner and Close Range Photogrammetry Datasets

Authors: M. Pepe, S. Ackermann, L. Fregonese, C. Achille

Abstract:

3D models obtained with advanced survey techniques such as close-range photogrammetry and laser scanning are nowadays particularly appreciated in the Cultural Heritage and Archaeology fields. In order to produce high-quality models representing archaeological evidence and anthropological artifacts, the appearance of the model (i.e., color), beyond the geometric accuracy, is not a negligible aspect. The integration of close-range photogrammetry survey techniques with the laser scanner is still a topic of study and research. Combining point cloud data sets of the same object generated with both technologies, or with the same technology but registered at different moments and/or under different natural light conditions, can produce a final point cloud with accentuated color dissimilarities. In this paper, a methodology to unify the different data sets, improve the chromatic quality, and highlight further details by balancing the point colors will be presented.
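One common baseline for balancing colors between overlapping point clouds is to match the per-channel color statistics (mean and standard deviation) of one data set to a reference cloud; the paper's actual adjustment method may differ. A minimal sketch on RGB tuples:

```python
def _mean_std(vals):
    n = len(vals)
    m = sum(vals) / n
    return m, (sum((v - m) ** 2 for v in vals) / n) ** 0.5

def match_colors(src, ref):
    """Shift and scale each RGB channel of `src` so its mean and standard
    deviation match those of `ref` (clamped to the 0-255 range)."""
    out = []
    for ch in range(3):
        sv = [p[ch] for p in src]
        rv = [p[ch] for p in ref]
        ms, ss = _mean_std(sv)
        mr, sr = _mean_std(rv)
        scale = sr / ss if ss > 0 else 1.0
        out.append([min(255.0, max(0.0, (v - ms) * scale + mr)) for v in sv])
    return list(zip(*out))

# hypothetical overlapping clouds: one acquired in darker lighting
src = [(10, 20, 30), (30, 40, 50), (50, 60, 70)]
ref = [(100, 110, 120), (120, 130, 140), (140, 150, 160)]
matched = match_colors(src, ref)
```

In practice the statistics would be computed only over points in the geometric overlap of the two registered clouds, so that genuine color differences elsewhere are preserved.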

Keywords: color models, cultural heritage, laser scanner, photogrammetry

Procedia PDF Downloads 280
2813 Educational Knowledge Transfer in Indigenous Mexican Areas Using Cloud Computing

Authors: L. R. Valencia Pérez, J. M. Peña Aguilar, A. Lamadrid Álvarez, A. Pastrana Palma, H. F. Valencia Pérez, M. Vivanco Vargas

Abstract:

This work proposes a cooperative-competitive (coopetitive) approach that allows coordinated work among the Secretary of Public Education (SEP), the Autonomous University of Querétaro (UAQ), and government funds from the National Council for Science and Technology (CONACYT) or other international organizations, within an overall knowledge transfer strategy based on e-learning over the Cloud. Experts in junior high and high school education, working in multidisciplinary teams, perform analysis, evaluation, design, production, validation, and knowledge transfer at large scale using a Cloud Computing platform, allowing teachers and students to have all the information required to ensure nationally homologated knowledge of topics such as mathematics, statistics, chemistry, history, ethics, civics, etc. The work will start with a pilot test in Spanish and, initially, in two regional indigenous languages, Otomí and Náhuatl. Otomí has more than 285,000 speakers in Querétaro and Mexico's central region. Náhuatl is the most widely spoken indigenous language in Mexico, with more than 1,550,000 speakers. Phase one of the project takes into account negotiations with indigenous tribes from different regions and the information and communication technologies needed to deliver the knowledge to the indigenous schools in their native language.
The methodology includes the following main milestones: identification of the indigenous areas where Otomí and Náhuatl are spoken; verification with the SEP of the locations of existing indigenous schools; analysis and inventory of current school conditions; negotiation with tribal chiefs; analysis of the technological communication requirements to reach the indigenous communities; identification and inventory of local teachers' technology knowledge; selection of a pilot topic; analysis of actual student competence under the traditional education system; identification of local translators; design of the e-learning platform; design of the multimedia resources and the storage strategy for Cloud Computing; translation of the topic into both languages; indigenous teacher training; pilot test; course release; project follow-up; analysis of student requirements for the new technological platform; and definition of a new and improved proposal with greater reach in topics and regions. The importance of phase one of the project is manifold: it proposes a working technological scheme and focuses on the cultural impact in Mexico, so that indigenous tribes can improve their knowledge about new forms of crop improvement, home storage technologies, proven home remedies for common diseases, and ways of preparing foods containing major nutrients; disclose the strengths and weaknesses of each region; communicate through cloud computing platforms offering regional products; and open communication spaces for inter-indigenous cultural exchange.

Keywords: Mexican indigenous tribes, education, knowledge transfer, cloud computing, Otomí, Náhuatl, language

Procedia PDF Downloads 404
2812 Early Detection of Damages in Railway Steel Truss Bridges from Measured Dynamic Responses

Authors: Dinesh Gundavaram

Abstract:

This paper presents an investigation of bridge damage detection based on dynamic responses estimated from a passing vehicle. A numerical simulation of a railway steel truss bridge was used in this investigation. The bridge response at different locations was computed using CSiBridge software. Several damage scenarios were considered, including different locations and severities. The potential of the dynamic properties of global modes for identifying structural changes in truss bridges is discussed based on the measurement results.
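A typical way the dynamic properties of global modes reveal structural change is through shifts in the identified natural frequencies relative to a healthy baseline; the sketch below illustrates that generic indicator (the paper's specific damage criterion is not stated in the abstract, and the 2% threshold is a hypothetical value):

```python
def frequency_shifts(baseline_hz, measured_hz):
    """Relative drop of each identified natural frequency vs. the baseline."""
    return [(b - m) / b for b, m in zip(baseline_hz, measured_hz)]

def flag_damage(baseline_hz, measured_hz, threshold=0.02):
    """Indices of global modes whose frequency dropped by more than the
    threshold (here an assumed 2%), a crude damage indicator."""
    shifts = frequency_shifts(baseline_hz, measured_hz)
    return [i for i, s in enumerate(shifts) if s > threshold]

# hypothetical modal frequencies (Hz) identified before and after damage
healthy = [3.2, 8.5, 14.1]
current = [3.2, 8.1, 14.0]
damaged_modes = flag_damage(healthy, current)
```

Frequency shifts alone indicate the presence but not the location of damage; locating it typically requires mode-shape or curvature information in addition.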

Keywords: bridge, damage, dynamic responses, detection

Procedia PDF Downloads 271
2811 A Review of BIM Applications for Heritage and Historic Buildings: Challenges and Solutions

Authors: Reza Yadollahi, Arash Hejazi, Dante Savasta

Abstract:

Building Information Modeling (BIM) is growing rapidly in construction projects around the world. Considering BIM's weaknesses in handling existing heritage and historical buildings, it is critical to facilitate BIM application for such structures. One of the pieces of information needed to build a model in BIM is the materials and their characteristics. A material library is essential to speed up the entry of project information. To save time and prevent cost overruns, a BIM object material library should be provided. However, the lack of information and documents on historical buildings is typically a challenge in renovation and retrofitting projects. Because case documents for historic buildings are scarce, importing data is a time-consuming task, which can be improved by creating BIM libraries. Based on previous research, this paper reviews the complexities and challenges of BIM modeling for heritage, historic, and architectural buildings. By identifying the strengths and weaknesses of standard BIM systems, recommendations are provided to enhance the modeling platform.

Keywords: building information modeling, historic buildings, heritage buildings, material library

Procedia PDF Downloads 117
2810 Definition, Structure, and Core Functions of the State Image

Authors: Rosa Nurtazina, Yerkebulan Zhumashov, Maral Tomanova

Abstract:

Humanity is entering an era in which 'virtual reality', the image of the world created by the media with the help of the Internet, does not match reality in many respects, and new communication technologies create a fundamentally different and previously unknown 'global space'. Under these conditions, the state begins to change the basic technology of political communication between the state and society, and between states. Nowadays, the image of the state is becoming a most important tool and technology. An image is a purposefully created representation granting a political object (a person, organization, country, etc.) certain social and political values and promoting a more emotional perception. The political image of the state plays an important role in international relations. The success of the country's foreign policy and the development of trade and economic relations with other countries depend on whether it is positive or negative. The foreign policy image also has an impact on political processes taking place within the state: a negative image of a country can be used by opposition forces as one of the arguments to criticize the government and its policies.

Keywords: image of the country, country's image classification, function of the country image, country's image components

Procedia PDF Downloads 435
2809 Review of the Software Used for 3D Volumetric Reconstruction of the Liver

Authors: P. Strakos, M. Jaros, T. Karasek, T. Kozubek, P. Vavra, T. Jonszta

Abstract:

In medical imaging, segmentation of different areas of the human body, such as bones, organs, and tissues, is an important issue. Image segmentation allows isolating the object of interest for further processing, which can lead, for example, to 3D model reconstruction of whole organs. The difficulty of this procedure varies from trivial for bones to quite difficult for organs like the liver. The liver is considered one of the most difficult human organs to segment, mainly because of its complexity, shape variability, and proximity to other organs and tissues. Due to these facts, substantial user effort usually has to be applied to obtain satisfactory segmentation results, and the process deteriorates from an automatic or semi-automatic one into a fairly manual one. In this paper, an overview of selected available software applications that can handle semi-automatic image segmentation with subsequent 3D volume reconstruction of the human liver is presented. The applications are evaluated based on the segmentation results for several consecutive DICOM images covering the abdominal area of the human body.
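Semi-automatic segmentation, as discussed here, typically means the user supplies minimal input such as a seed point and the algorithm does the rest; a classic example is seeded region growing. A toy sketch on a 2D intensity grid (real DICOM slices would be 16-bit arrays, and liver segmentation needs far more robust criteria than a fixed tolerance):

```python
from collections import deque

def region_grow(img, seed, tol):
    """Grow a region from a user-supplied seed pixel, accepting 4-connected
    neighbours whose intensity is within `tol` of the seed intensity."""
    rows, cols = len(img), len(img[0])
    sr, sc = seed
    base = img[sr][sc]
    region = {(sr, sc)}
    queue = deque([(sr, sc)])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(img[nr][nc] - base) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

# toy intensity grid: a low-intensity "organ" next to bright tissue
img = [[10, 10, 200, 200],
       [10, 12, 200, 200],
       [11, 10, 10, 200]]
organ = region_grow(img, (0, 0), tol=5)
```

The weakness the abstract alludes to is visible even here: where the liver's intensity overlaps with neighbouring tissue, a simple tolerance leaks, which is why the reviewed tools require interactive correction.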

Keywords: image segmentation, semi-automatic, software, 3D volumetric reconstruction

Procedia PDF Downloads 290
2808 Identification and Management of Septic Arthritis of the Untouched Glenohumeral Joint

Authors: Sumit Kanwar, Manisha Chand, Gregory Gilot

Abstract:

Background: Septic arthritis of the shoulder has infrequently been discussed, and infection of the untouched shoulder has not heretofore been described. We present four patients with glenohumeral septic arthritis. Methods: Case 1: a 59-year-old male with left shoulder pain in the anterior, posterior, and superior aspects. Case 2: a 60-year-old male with fever, chills, and generalized muscle aches. Case 3: a 70-year-old male with right shoulder pain about the anterior and posterior aspects. Case 4: a 55-year-old male with global right shoulder pain, swelling, and limited range of motion (ROM). Results: In case 1, the left shoulder was affected. On physical examination, swelling was notable, with global tenderness and painful ROM. Laboratory values showed an erythrocyte sedimentation rate (ESR) of 96 and a C-reactive protein (CRP) of 304.30. Imaging studies were performed, and MRI indicated a high suspicion for an abscess with osteomyelitis of the humeral head. In our second case, the left arm was affected; he had swelling, global tenderness, and painful ROM. His ESR was 38 and CRP 14.9; X-ray showed severe arthritis. In case 3, the right arm was affected; again, global tenderness and painful ROM were observed. His ESR was 94 and CRP 10.6; X-ray displayed an eroded glenoid space. In our fourth case, the right shoulder was affected, with global tenderness and painful, limited ROM. ESR was 108 and CRP 2.4; X-ray was non-significant. Discussion: Monoarticular septic arthritis of the virgin glenohumeral joint is seldom diagnosed in clinical practice. Common denominators include an elevated ESR, painful and limited ROM, and involvement of the dominant arm. The male population is more frequently affected, with an average age of 57. Septic arthritis is managed with incision and drainage or needle aspiration of synovial fluid, supplemented with 3-6 weeks of intravenous antibiotics. Due to better irrigation and joint visualization, arthroscopy is preferred.
Open surgical drainage may be indicated if the above methods fail. Conclusion: If a middle-aged male presents with vague anterior or posterior shoulder pain, elevated inflammatory markers, and a low-grade fever, an X-ray should be performed. If this displays degenerative joint disease, a complete further workup with advanced imaging, such as MRI, CT, or ultrasound, should follow. If these imaging modalities display anterior joint space effusion with soft tissue involvement, septic arthritis of the untouched glenohumeral joint can be suspected, and surgery is indicated.

Keywords: glenohumeral joint, identification, infection, septic arthritis, shoulder

Procedia PDF Downloads 422
2807 Numerical Modeling of Air Pollution with PM-Particles and Dust

Authors: N. Gigauri, A. Surmava, L. Intskirveli, V. Kukhalashvili, S. Mdivani

Abstract:

The subject of our study is the numerical modeling of atmospheric air pollution. The object of research is Tbilisi, the capital of Georgia, a city of one and a half million inhabitants with difficult terrain. The main sources of pollution in Tbilisi are currently vehicles and construction dust. The concentrations of dust and PM (particulate matter) were determined in the air of Tbilisi and its vicinity, and their monthly maximum, minimum, and average concentrations were estimated. Processes of dust propagation in the atmosphere of the city and its surrounding territory are modeled using a 3D regional model of atmospheric processes and an admixture transfer-diffusion equation. Distributions of the polluted cloud and dust concentrations were obtained for different areas of the city, at different heights, and at different time intervals, under background stationary westward and eastward winds. It is shown that the difficult terrain and mountain-bar circulation affect the deformation of the cloud and its spread, and time periods are determined during which the dust concentration in the city exceeds the MAC (Maximum Allowable Concentration, MAC = 0.5 mg/m³).
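The admixture transfer-diffusion equation mentioned above has the general form ∂c/∂t + u·∇c = K∇²c. A one-dimensional explicit upwind discretization, a drastic simplification of the paper's 3D regional model, with entirely hypothetical parameters, illustrates the numerical approach and the MAC check:

```python
def step(c, u, k, dx, dt):
    """One explicit step of the 1-D transfer-diffusion equation
    dc/dt + u*dc/dx = k*d2c/dx2, with upwind advection.
    A drastic simplification of the 3-D regional model used in the paper."""
    n = len(c)
    new = c[:]
    for i in range(1, n - 1):
        adv = -u * (c[i] - c[i-1]) / dx if u >= 0 else -u * (c[i+1] - c[i]) / dx
        dif = k * (c[i+1] - 2 * c[i] + c[i-1]) / dx ** 2
        new[i] = c[i] + dt * (adv + dif)
    return new

MAC = 0.5          # mg/m^3, maximum allowable concentration
c = [0.0] * 50
c[10] = 2.0        # hypothetical dust release, mg/m^3
for _ in range(100):
    c = step(c, u=1.0, k=0.5, dx=10.0, dt=1.0)   # wind 1 m/s, K = 0.5 m^2/s
over_mac = [i for i, v in enumerate(c) if v > MAC]  # cells exceeding the MAC
```

The chosen steps satisfy the explicit stability constraints (Courant number u·dt/dx = 0.1, diffusion number k·dt/dx² = 0.005); in the full 3D model the terrain enters through the velocity field u.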

Keywords: air pollution, dust, numerical modeling, PM-particles

Procedia PDF Downloads 140
2806 Integrated Target Tracking and Control for Automated Car-Following of Truck Platforms

Authors: Fadwa Alaskar, Fang-Chieh Chou, Carlos Flores, Xiao-Yun Lu, Alexandre M. Bayen

Abstract:

This article proposes a perception model for enhancing the accuracy and stability of car-following control of a longitudinally automated truck. We applied a fusion-based tracking algorithm to measurements of the single preceding vehicle needed for car-following control. The algorithm fuses two types of data, radar and LiDAR, to obtain a more accurate and robust longitudinal perception of the subject vehicle in various weather conditions. The filter's resulting signals are fed, at every tracking loop, to the gap control algorithm, which is composed of a high-level gap controller and a lower-level acceleration tracking system. Several highway tests have been performed with two trucks. The tests show accurate and fast tracking of the target, which positively impacts the gap control loop. The experiments also show the fulfillment of control design requirements, such as fast tracking of speed variations and robust time-gap following.
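The abstract does not specify the fusion filter. A common minimal scheme for this kind of problem is inverse-variance weighting of the radar and LiDAR ranges followed by a scalar Kalman update of the estimated gap; the sketch below is under that assumption, with hypothetical noise variances:

```python
def fuse(z_radar, var_radar, z_lidar, var_lidar):
    """Inverse-variance fusion of radar and LiDAR ranges to the same target."""
    w = var_lidar / (var_radar + var_lidar)      # weight on the radar range
    z = w * z_radar + (1.0 - w) * z_lidar
    var = var_radar * var_lidar / (var_radar + var_lidar)
    return z, var

def kalman_update(x, p, z, r):
    """Scalar Kalman measurement update of the estimated gap `x`."""
    k = p / (p + r)                              # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

# hypothetical single loop iteration: prior gap 20 m, noisy sensor readings
z, r = fuse(20.4, 0.25, 20.1, 0.04)
gap, gap_var = kalman_update(20.0, 4.0, z, r)
```

The robustness to weather claimed above comes from the variances: when rain degrades the LiDAR, raising its variance automatically shifts the fused estimate toward the radar.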

Keywords: object tracking, perception, sensor fusion, adaptive cruise control, cooperative adaptive cruise control

Procedia PDF Downloads 229