Search results for: content-based features
2193 Renegotiating International Contract Clauses: The Case of Investment Environment Changes in Egypt
Authors: Marwa Zein
Abstract:
The long-term nature of the contract is one of the major features that distinguishes international trade and investment contracts from other domestic contracts. This is due to the nature of the contract and the huge works required to be performed on the one hand, or the desire of the parties to achieve stability in their transactions on the other. However, long-term contracts might expose the parties to certain events and circumstances that impact their capability to execute their obligations under these contracts. During the year 2016, the Egyptian government took a series of economic decisions which greatly impacted the economic and investment environment. Consequently, many contracts encountered problems in their execution due to such changes, which greatly influenced the performance of their obligations, a matter that necessitated the renegotiation of the conditions of these contracts on the basis of unpredicted changes that could be listed under the force majeure clause. The principle of fair and equitable treatment in investment places an obligation on the Egyptian government to consider the renegotiation of contract clauses based on the new conditions. This paper discusses the idea of renegotiating international trade and investment contracts in Egypt with reference to the changes the economic environment has witnessed lately.
Keywords: change of circumstances, international contracts, investment contracts, renegotiation
Procedia PDF Downloads 195
2192 Science Communication: A Possible Dialogue between Researchers and Agribusiness Farmers
Authors: Cristiane Hengler Corrêa Bernardo
Abstract:
Communication is an essential part of the process that characterizes scientific research. It should be present in every stage of research in a systemic way. However, this process is not always efficient and effective. Reports of researchers focused on agribusiness point to difficulties in communicating with farmers that negatively impact research results and may cause distortions and even quite significant inconsistencies. This research aims at identifying the main noise and barriers in communication between agribusiness researchers and farmers. It discusses the possibility of creating a specific strategy to correct or minimize such failures. The main research question is: what features of the communication process are decisive for communication between agribusiness researchers and farmers to occur with greater efficiency? It is expected that the research will result in processes that may correct or minimize such problems, promoting more efficient dialogue and knowledge exchange. The research will adopt a qualitative approach, using action research as a form of investigative action of a social and educational nature, aiming at promoting understanding and interaction between researchers and members of the investigated situations. To collect and analyze data, document analysis, questionnaires, interviews and content analysis will be used.
Keywords: agribusiness farmers, researchers, science communication, analysis
Procedia PDF Downloads 274
2191 The Guide Presentation: The Grand Palace
Authors: Nuchanat Handumrongkul, Danaya Darnsawasdi, Anantachai Aeka
Abstract:
This research was conducted to provide a model for oral presentations performed by tour guides. Its purpose is to analyze the content used by tour guides in order to develop French language teaching and studying for tourism. The study employed audio recordings of these presentations as an interview method in authentic situations, with four guides as respondents and information providers. The data was analyzed through content analysis. The results found that the tour guides described eight important items, giving more importance to details at Wat Phra Kaew, or the Temple of the Emerald Buddha, than at the palaces. They preferred the buildings upon the upper terrace, Buddhist cosmology, the decoration techniques, the royal chapel, the mural paintings, Thai offerings to Buddha images, and palaces with architectural features and functions, including royal ceremonies and others. This information represents the Thai characteristics of each building and other related content. The findings were used as a manual for guides on how to describe a tourist attraction, especially the temple and other related cultural topics of interest.
Keywords: guide, guide presentation, Grand Palace, Buddhist cosmology
Procedia PDF Downloads 498
2190 Under the ‘Fourth World’: A Discussion of the Transformation of Character-Settings in Chinese Ethnic Minority Films
Authors: Sicheng Liu
Abstract:
Based on the key issues of current fourth world studies, this article aims to analyze the features of character-settings in Chinese ethnic minority films. As a generalizable transformation, this feature progresses from a microcosmic representation. It argues that, as a mediation, films note down the current state of people and their surroundings, while the ‘fourth world’ theorization (or the fourth cinema) provides a new perspective on ethnic minority topics in China. Like the ‘fourth cinema’ focusing on the depiction of indigenous groups, ethnic minority films portray the non-Han nationalities in China. Both types possess the motif of returning history-writing to the minority members’ own hands. In this article, the discussion involves three types of cinematic role-settings in Chinese minority themed films, which illustrates that, similar to the creative principle of the fourth film, the themes and narratives of these films are becoming more individualized, with more concern for minority grassroots.
Keywords: ‘fourth world’, Chinese ethnic minority films, ethnicity and culture reflection, ‘mother tongue’ (muyu), highlighting the individual spirit
Procedia PDF Downloads 186
2189 POD and Wavelets Application for Aerodynamic Design Optimization
Authors: Bonchan Koo, Junhee Han, Dohyung Lee
Abstract:
This research attempts to evaluate the accuracy and efficiency of a design optimization procedure which combines a wavelets-based solution algorithm and a proper orthogonal decomposition (POD) database management technique. Aerodynamic design procedures call for high-fidelity computational fluid dynamics (CFD) simulations and the consideration of a large number of flow conditions and design constraints. Even with significant computing power advancements, the current level of integrated design process requires substantial computing time and resources. POD reduces the degrees of freedom of the full system by conducting singular value decomposition on various field simulations. For additional efficiency improvement of the procedure, an adaptive wavelet technique is also employed during the POD training period. The proposed design procedure was applied to the optimization of wing aerodynamic performance. Throughout the research, it was confirmed that the POD/wavelets design procedure could significantly reduce the total design turnaround time and is also able to capture all detailed complex flow features as in full-order analysis.
Keywords: POD (Proper Orthogonal Decomposition), wavelets, CFD, design optimization, ROM (Reduced Order Model)
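The POD step described above is, at its core, a singular value decomposition of a snapshot matrix of flow solutions. The following is a minimal sketch of the idea; the matrix sizes, the random stand-in data, and the 99% energy threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical snapshot matrix: each column is one flow-field solution
# (here 1000 "grid points" x 20 flow conditions of random stand-in data).
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((1000, 20))

# POD via singular value decomposition of the snapshot matrix.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)

# Keep the r leading modes capturing 99% of the singular-value "energy".
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1
basis = U[:, :r]                  # reduced POD basis (orthonormal columns)

# A full-order snapshot is then represented by r modal coefficients.
coeffs = basis.T @ snapshots[:, 0]
reconstruction = basis @ coeffs
```

In a reduced-order model, the governing equations are projected onto `basis`, so only `r` coefficients need to be evolved instead of the full-order field.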
Procedia PDF Downloads 463
2188 Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis
Authors: Amir Hajian, Sepehr Damavandinejadmonfared
Abstract:
In this paper, the issue of dimensionality reduction is investigated in finger vein recognition systems using kernel principal component analysis (KPCA). One aspect of KPCA is finding the most appropriate kernel function for finger vein recognition, as there are several kernel functions which can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated. The dimension of the feature vector in PCA-based algorithms is of importance, especially when it comes to real-world applications and usage of such algorithms: a fixed dimension of the feature vector has to be set to reduce the dimension of the input and output data and extract the features from them. Then a classifier is applied to classify the data and make the final decision. We analyze KPCA (with polynomial, Gaussian, and Laplacian kernels) in detail in this paper and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.
Keywords: biometrics, finger vein recognition, principal component analysis (PCA), kernel principal component analysis (KPCA)
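To illustrate the kind of pipeline the paper studies, here is a minimal kernel PCA sketch in plain NumPy: build a kernel matrix, double-center it in feature space, and keep a fixed number `d` of leading components. The Gaussian kernel, its `gamma`, the data sizes and `d` below are illustrative assumptions; a real system would use actual finger vein feature vectors:

```python
import numpy as np

def gaussian_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca(X, kernel, d):
    """Project the rows of X onto the d leading kernel principal components."""
    n = X.shape[0]
    K = kernel(X, X)
    ones = np.full((n, n), 1.0 / n)
    Kc = K - ones @ K - K @ ones + ones @ K @ ones    # center in feature space
    vals, vecs = np.linalg.eigh(Kc)                   # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:d]                  # keep the d largest
    vals, vecs = vals[idx], vecs[:, idx]
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))  # unit norm in feature space
    return Kc @ alphas                                # n x d reduced features

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 8))   # stand-in for finger vein feature vectors
Z = kpca(X, gaussian_kernel, d=5)  # fixed feature-extraction dimension d
```

The choice of `d` here is exactly the "feature extraction dimension" whose optimal value the paper investigates; sweeping `d` and measuring classifier accuracy would reproduce the experimental setup in spirit.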
Procedia PDF Downloads 363
2187 Connecting Students and Faculty Research Efforts through the Research and Projects Portal
Authors: Havish Nalapareddy, Mark V. Albert, Ranak Bansal, Avi Udash, Lin Lin
Abstract:
Students engage in many course projects during their degree programs. However, impactful projects often need a time frame longer than a single semester. Ideally, projects are documented and structured to be readily accessible to future students who may choose to continue them, with features that emphasize the local community, university, or course structure. The Research and Projects Portal (RAPP) is a place where students can post both their completed and ongoing projects with all the resources and tools used. This portal allows students to see what other students have done in the past, in the same university environment, related to their domain of interest. Computer science instructors or students selecting projects can use the portal to assign or choose an incomplete project. Additionally, the portal allows non-computer science faculty and industry collaborators to document their project ideas for students in courses to prototype directly, rather than soliciting the help of instructors in engaging students. RAPP serves as a platform linking students across classes and faculty both in and out of computer science courses on joint projects, encouraging long-term project efforts across semesters or years.
Keywords: education, technology, research, academic portal
Procedia PDF Downloads 136
2186 Deep Learning Based Polarimetric SAR Image Restoration
Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli
Abstract:
In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for Earth surface monitoring. SAR systems are often designed to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings the property of rotational invariance to geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area or swath of fully polarimetric images compared to the swath of dual or hybrid polarimetric images. The search for solutions to augment dual polarimetric data to full polarimetric data will therefore take advantage of full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images using hybrid polarimetric data can be found in the literature.
Although the improvements achieved by the newly investigated and tested reconstruction techniques are undeniable, the existing methods are mostly based upon model assumptions (especially the assumption of reflection symmetry), which may limit their reliability and applicability in vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses deep learning solutions to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images, defined by the combination of different terms in the cost or loss function. The proposed method is experimentally validated with real data sets and compared with a well-known and standard approach from the literature. From the experiments, the reconstruction performance of the proposed framework is superior to conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
Keywords: SAR image, deep learning, convolutional neural network, deep neural network, SAR polarimetry
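The idea of controlling training through a cost function that combines several terms can be sketched abstractly. The toy loss below (NumPy, with made-up weights and a span-style total-power term) only illustrates the general shape of a multi-term objective; the paper's actual loss terms and CNN architecture are not reproduced here:

```python
import numpy as np

def composite_loss(pred, target, w_rec=1.0, w_span=0.5):
    """Toy multi-term cost: per-channel reconstruction error plus a
    term on the total power summed over the polarimetric channels
    (last axis). Weights w_rec and w_span are illustrative."""
    rec = np.mean((pred - target) ** 2)                    # channel-wise MSE
    span_err = np.mean((pred.sum(-1) - target.sum(-1)) ** 2)
    return w_rec * rec + w_span * span_err
```

Weighting such terms differently steers the network toward preserving particular scattering properties, which is the mechanism the abstract describes.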
Procedia PDF Downloads 89
2185 Impacts of Building Design Factors on Auckland School Energy Consumption
Authors: Bin Su
Abstract:
This study focuses on the impact of school building design factors on winter extra energy consumption, which mainly includes space heating, water heating and other appliances related to winter indoor thermal conditions. A number of Auckland schools were randomly selected for the study, which introduces a method of using real monthly energy consumption data for a year to calculate winter extra energy data of school buildings. The study seeks to identify the relationships between winter extra energy data and school building design data related to the main architectural features, building envelope and elements of the sample schools. These relationships can be used to estimate the approximate saving in winter extra energy consumption which would result from a changed design datum for future school development, and to identify any major energy-efficient design problems. The relationships are also valuable for developing passive design guides for school energy efficiency.
Keywords: building energy efficiency, building thermal design, building thermal performance, school building design
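One plausible way to compute "winter extra energy" from a year of real monthly data, in the spirit of the method described above, is to subtract a non-winter baseline from the winter months. The month selection, consumption figures and baseline definition below are illustrative assumptions, not the paper's exact procedure:

```python
# Hypothetical monthly electricity use (kWh) for one school, Jan-Dec.
monthly = [2700, 2750, 3000, 3500, 4300, 5200,
           5400, 5000, 4400, 3600, 3100, 2800]
winter = {5, 6, 7}   # June-August (0-based indices), Auckland winter

# Baseline: mean consumption over the non-winter months.
non_winter = [v for m, v in enumerate(monthly) if m not in winter]
baseline = sum(non_winter) / len(non_winter)

# Winter extra energy: winter consumption above the baseline level.
winter_extra = sum(monthly[m] - baseline for m in winter)
print(winter_extra)  # 5550.0 with the numbers above
```

Regressing `winter_extra` for each sample school against design data (envelope area, glazing, etc.) would then yield the relationships the study seeks.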
Procedia PDF Downloads 440
2184 Using AI Based Software as an Assessment Aid for University Engineering Assignments
Authors: Waleed Al-Nuaimy, Luke Anastassiou, Manjinder Kainth
Abstract:
As the process of teaching has evolved with the advent of new technologies over the ages, so has the process of learning. Educators have perpetually been on the lookout for new technology-enhanced methods of teaching in order to increase learning efficiency and decrease ever-expanding workloads. Shortly after the invention of the internet, web-based learning started to pick up in the late 1990s, and educators quickly found that the process of providing learning material and marking assignments could change thanks to the connectivity offered by the internet. With the creation of early web-based virtual learning environments (VLEs) such as SPIDER and Blackboard, it soon became apparent that VLEs resulted in higher reported computer self-efficacy among students, but at the cost of students being less satisfied with the learning process. It may be argued that the impersonal nature of VLEs and their limited functionality were the leading factors contributing to this reported dissatisfaction. To this day, often faced with the prospect of assigning colossal engineering cohorts their homework and assessments, educators may frequently choose optimally curated assessment formats, such as multiple-choice quizzes and numerical answer input boxes, so that automated grading software embedded in the VLEs can save time and mark student submissions instantaneously. A crucial skill that is meant to be learnt during most science and engineering undergraduate degrees is gaining confidence in using, solving and deriving mathematical equations. Equations underpin a significant portion of the topics taught in many STEM subjects, and it is in homework assignments and assessments that this understanding is tested. It is not hard to see that this becomes challenging if the majority of assignment formats students engage with are multiple-choice questions, and educators end up with a reduced perspective of their students’ ability to manipulate equations.
Artificial intelligence (AI) has in recent times been shown to be an important consideration for many technologies. In our paper, we explore the use of new AI-based software designed to work in conjunction with current VLEs. Drawing on our experience with the software, we discuss its potential to solve a selection of problems ranging from impersonality to the reduction of educator workloads by speeding up the marking process. We examine the software’s potential to increase learning efficiency through its features, which claim to allow more customized and higher-quality feedback. We investigate the usability of features allowing students to input equation derivations in a range of different forms, and discuss relevant observations associated with these input methods. Furthermore, we make ethical considerations and discuss potential drawbacks of the software, including the extent to which optical character recognition (OCR) could play a part in the perpetuation of errors and create disagreements between student intent and submitted assignment answers. It is the intention of the authors that this study will be useful as an example of the implementation of AI in a practical assessment scenario, and as a springboard for further considerations and studies that utilise AI in the setting and marking of science and engineering assignments.
Keywords: engineering education, assessment, artificial intelligence, optical character recognition (OCR)
Procedia PDF Downloads 121
2183 Psychosocial Predictors of Brand Loyalty in Pakistani Consumers
Authors: Muhammad Sulman, Tabinda Khurshid, Afsheen Masood
Abstract:
The current research focused on determining the factors that determine brand loyalty in consumers. It was hypothesized that certain demographic features lead consumers to adhere more to certain brands. A cross-sectional research design was used. The sample comprised participants (N=500) from the age group 16 to 55 years. The data was collected through a self-constructed demographic questionnaire as well as a self-constructed Brand Loyalty Questionnaire, which was adapted after taking permission from the researchers. A pilot study was conducted to remove all ambiguities from the questionnaire. The final version was administered to 250 participants. Descriptive and inferential analyses were carried out in SPSS version 24.00 to explore the factors that determine brand loyalty. The findings revealed that there is a relationship between brand loyalty and demographics, and certain factors emerged as significant predictors of brand loyalty in young and middle-aged consumers. The research findings carry strong implications for organizational and consumer psychologists in particular, and for professionals in marketing and policy making in general.
Keywords: consumers, consumer psychologists, marketing, organizational, policy making
Procedia PDF Downloads 270
2182 Genistein Treatment Confers Protection Against Gliopathy & Vasculopathy of the Diabetic Retina in Rats
Authors: Sanaa AM Elgayar, Sohair A Eltony, Maha Mahmoud Abd El Rouf
Abstract:
Background: Retinopathy remains an important complication of diabetes. Aim of work: This work was carried out to evaluate the protective effects of genistein against diabetic retinopathy in rats. Material and methods: Fifteen adult male albino rats were divided into two groups; Group I: control (n=5) and Group II: streptozotocin-induced diabetic group (n=10), which was equally divided into two subgroups; IIa (diabetic vehicle control) and IIb (diabetic genistein-treated). Specimens were taken from the retina 12 weeks post-induction, processed and examined using light, immunohistochemical and ultrastructural techniques. Blood samples were assayed for glucose levels. Results: In comparison with the diabetic non-treated group, the histological changes in macroglial and microglial cell reactivity and retinal blood capillaries were improved in the genistein-treated group. In addition, GFAP and iNOS expression in the retina and the blood glucose level were reduced. Conclusion: Genistein ameliorates the histological changes of diabetic retinopathy, reaching healing features which resemble those of a normal retina.
Keywords: diabetic retinopathy, genistein, glia, capillaries
Procedia PDF Downloads 314
2181 Random Analysis of Physical and Mechanical Characteristics of Superfine Animal Fibres
Authors: Sepehr Moradi
Abstract:
The physical and mechanical property parameters, the inter-relation of key dimensions and the distribution profiles of raw Australia Superfine Merino Wool (ASFW) and Inner Mongolia Cashmere (IMC) fibres have been studied. The relationship between the properties of these fibres is assessed using fit transformation functions obtained through correlation coefficient analysis. ASFW and IMC fibre properties are found to be both positively skewed and asymmetric in nature, while fibre diameter varies along the length and both ends have a tapering shape. The basic physical features, namely linear density, true local diameter, true length and breaking load, are positively correlated, while tenacity is negatively correlated. Tenacity and true length follow a second-order polynomial relationship, while the true local diameter is linearly correlated. Assessment of the diameter and length is sufficient to estimate the quality of commercial-grade ASFW and IMC fibres.
Keywords: Australia Superfine Merino Wool fibre, Inner Mongolia Cashmere fibre, distribution profile, physical properties
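The reported second-order polynomial relationship between tenacity and true length can be sketched with an ordinary least-squares polynomial fit. The measurement values below are hypothetical stand-ins for real fibre test data, not numbers from the study:

```python
import numpy as np

# Hypothetical paired measurements: true length (mm) vs tenacity (cN/tex).
length = np.array([30.0, 40.0, 50.0, 60.0, 70.0, 80.0])
tenacity = np.array([12.1, 11.0, 10.2, 9.8, 9.7, 9.9])

# Second-order polynomial fit, as reported for tenacity vs true length.
coeffs = np.polyfit(length, tenacity, 2)
fitted = np.polyval(coeffs, length)

# Correlation between measured and fitted values gauges fit quality,
# mirroring the paper's correlation coefficient analysis.
r = np.corrcoef(tenacity, fitted)[0, 1]
```

The same pattern with `deg=1` would give the linear fit reported for true local diameter.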
Procedia PDF Downloads 155
2180 A New Approach towards the Development of Next Generation CNC
Authors: Yusri Yusof, Kamran Latif
Abstract:
The Computer Numeric Control (CNC) machine has been widely used in industry since its inception. Currently, CNC technology is used for various operations like milling, drilling, packing, welding, etc. With the rapid growth of the manufacturing world, the demand for flexibility in CNC machines has rapidly increased. Previously, commercial CNC failed to provide flexibility because its structure was of a closed nature that does not provide access to the inner features of the CNC. Also, the CNC's operating ISO data interface model was found to be limited. Therefore, to overcome these problems, Open Architecture Control (OAC) technology and the STEP-NC data interface model were introduced. At present, the Personal Computer (PC) has been the best platform for the development of open-CNC systems. In this paper, ISO data interface model interpretation, verification and execution are highlighted with the introduction of new techniques. The proposed system is composed of ISO data interpretation, 3D simulation and machine motion control modules. The system was tested on an old 3-axis CNC milling machine. The results are found to be satisfactory in performance. This implementation has successfully enabled a sustainable manufacturing environment.
Keywords: CNC, ISO 6983, ISO 14649, LabVIEW, open architecture control, reconfigurable manufacturing systems, sustainable manufacturing, Soft-CNC
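For context, the classic ISO 6983 interface that the paper contrasts with STEP-NC (ISO 14649) is a word-address G-code format. A minimal, illustrative reader for linear moves might look like the sketch below; it is not the authors' interpreter, and it only tracks modal X/Y/Z words while ignoring feed rates, arcs and M-codes:

```python
import re

def parse_gcode(program):
    """Follow the X/Y/Z words of an ISO 6983 part program and return
    the sequence of commanded tool positions (a toy interpreter)."""
    pos = {"X": 0.0, "Y": 0.0, "Z": 0.0}
    path = []
    for line in program.splitlines():
        line = line.split(";")[0].strip().upper()   # drop ;-comments
        if not line:
            continue
        moved = False
        for letter, value in re.findall(r"([A-Z])(-?\d+\.?\d*)", line):
            if letter in pos:
                pos[letter] = float(value)
                moved = True
        if moved:
            path.append((pos["X"], pos["Y"], pos["Z"]))
    return path

demo = "G0 X0 Y0 Z5\nG1 Z-1 ; plunge\nG1 X10 Y5"
print(parse_gcode(demo))
```

The brittleness of recovering intent from such low-level motion words is exactly the limitation of the ISO 6983 model that motivates the feature-based STEP-NC interface.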
Procedia PDF Downloads 513
2179 Effect of the Nature of the Precursor on the Performance of Cu-Mn Catalysts for CO and VOCs Oxidation
Authors: Elitsa Kolentsova, Dimitar Dimitrov, Krasimir Ivanov
Abstract:
The catalytic oxidation of methanol to formaldehyde is an important industrial process in which the waste gas, in addition to CO, contains methanol and dimethyl ether (DME). Evaluation of the possibility of removing the harmful components from the exhaust gases requires a more complex investigation. Our previous work indicates that supported Cu-Mn oxide catalysts are promising for effective deep oxidation of these compounds. This work relates to a catalyst comprising copper-manganese spinel coated on a γ-Al₂O₃ carrier. The effect of preparation conditions on the active component composition and activity behavior of the catalysts is discussed. Different organometallic compounds based on four natural amino acids (glycine, alanine, valine, leucine) were used as precursors for the preparation of catalysts with a Cu/Mn molar ratio of 1:5. X-ray and TEM analyses were performed on the bulk and surface composition of the catalysts, and the specific surface area was determined by the BET method. The results obtained show that the activity of the catalysts increases by up to 40%, although there are some specific features depending on the nature of the amino acid and the oxidized compound.
Keywords: Cu-Mn/γ-Al₂O₃, CO and VOCs oxidation, heterogeneous catalysis, amino acids
Procedia PDF Downloads 239
2178 Vortex Structures in Internal Laminar and Turbulent Flows
Authors: Farid Gaci, Zoubir Nemouchi
Abstract:
A numerical study of laminar and turbulent fluid flows in a 90° bend of square section was carried out. Three-dimensional meshes, based on hexahedral cells, were generated. The QUICK scheme was employed to discretize the convective term in the transport equations. The SIMPLE algorithm was adopted to treat the velocity-pressure coupling. The flow structure obtained showed interesting features such as recirculation zones and counter-rotating pairs of vortices. The performance of three different turbulence models was evaluated: the standard k-ω model, the SST k-ω model and the Reynolds Stress Model (RSM). Overall, it was found that the multi-equation model performed better than the two-equation models. In fact, the existence of four pairs of counter-rotating cells in the straight duct upstream of the bend was predicted by the RSM closure but not by the standard eddy viscosity model or the SST k-ω model. The analysis of the results led to a better understanding of the induced three-dimensional secondary flows and the behavior of the local pressure coefficient and the friction coefficient.
Keywords: curved duct, counter-rotating cells, secondary flow, laminar, turbulent
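For reference, the QUICK scheme mentioned above interpolates the convected value at a cell face with a quadratic upwind stencil: on a uniform grid, φ_f = (6/8)φ_C + (3/8)φ_D − (1/8)φ_U using the upstream, central and downstream node values. A one-line sketch (uniform-grid coefficients assumed):

```python
def quick_face_value(phi_u, phi_c, phi_d):
    """QUICK interpolation of the convected variable at a cell face from
    the upstream (U), central (C) and downstream (D) node values,
    assuming a uniform grid."""
    return 0.75 * phi_c + 0.375 * phi_d - 0.125 * phi_u

# The stencil is exact for quadratic profiles: with nodes at x = 0, 1, 2
# and the face at x = 1.5, phi = x**2 gives phi_face = 1.5**2 = 2.25.
print(quick_face_value(0.0, 1.0, 4.0))  # 2.25
```

This third-order accuracy on smooth profiles is why QUICK is a common choice for the convective term in such bend-flow studies.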
Procedia PDF Downloads 336
2177 Effectiveness of PowerPoint Presentations in Teaching Anatomy: A Student's Perspective
Authors: Vrinda Hari Ankolekar
Abstract:
Introduction: The advancement of various audio-visual aids in the present era has led to progressive changes in education. The use of PowerPoint presentations plays a key role in learning and understanding a particular topic in anatomy. As the subject of anatomy involves many illustrations and demonstrations, PowerPoint presentations become essential in conveying the necessary information. Objectives: To assess students' perspectives on the use of PowerPoint presentations in teaching anatomy. Method: A questionnaire was constructed and 55 students were asked to state their preference for PowerPoint presentations or the blackboard as the medium that would help them understand the subject better. Results and conclusion: 30 voted PowerPoint the better and more effective tool for explaining the subject efficiently. 35 chose PowerPoint as more creative than the blackboard in creating interest in the subject. 20 wanted to retain chalk and talk for teaching instead of replacing it with PowerPoint. 36 felt chalk and talk was the more useful and appropriate teaching tool. Only 25 found chalk and talk relatively more boring than PowerPoint. 23 experienced more involvement and active participation in class when chalk and talk was used as the teaching tool. 26 stated that chalk and talk has most of the features needed for teaching. One limitation of this study is that the sample is drawn from one institution only and deals with the experience of one particular group of individuals.
Keywords: chalk and board, powerpoint presentation, presentation skills, teaching technologies
Procedia PDF Downloads 408
2176 Evaluation of Fusion Sonar and Stereo Camera System for 3D Reconstruction of Underwater Archaeological Object
Authors: Yadpiroon Onmek, Jean Triboulet, Sebastien Druon, Bruno Jouvencel
Abstract:
The objective of this paper is to develop 3D underwater reconstruction of archaeological objects, based on the fusion of a sonar system and a stereo camera system. The underwater images are obtained from a calibrated camera system. Multiple image pairs are input, and we first address image processing by applying well-known filters to improve the quality of the underwater images. The features of interest between image pairs are selected by well-known methods: a FAST detector and a FLANN-based descriptor matcher. Subsequently, the RANSAC method is applied to reject outlier points. The putative inliers are matched by triangulation to produce local sparse point clouds in 3D space, using a pinhole camera model and Euclidean distance estimation. The SfM technique is used to build the global sparse point clouds. Finally, the ICP method is used to fuse the sonar information with the stereo model. The final 3D models were verified by measurement comparison with the real object.
Keywords: 3D reconstruction, archaeology, fusion, stereo system, sonar system, underwater
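The triangulation step of such a pipeline, after RANSAC inlier matching, is commonly done with a linear (DLT) solve per point. The sketch below uses two hypothetical pinhole projection matrices rather than the authors' calibrated underwater rig:

```python
import numpy as np

def project(P, X):
    """Pinhole projection of 3D point X by a 3x4 camera matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from its pixel
    observations x1, x2 in two calibrated views."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)      # null vector of A = homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical stereo pair: identity camera and a 1-unit baseline.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
X_hat = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

Running this per inlier match yields the local sparse point cloud that SfM then registers globally and ICP aligns with the sonar data.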
Procedia PDF Downloads 298
2175 Identifying Degradation Patterns of Li-Ion Batteries from Impedance Spectroscopy Using Machine Learning
Authors: Yunwei Zhang, Qiaochu Tang, Yao Zhang, Jiabin Wang, Ulrich Stimming, Alpha Lee
Abstract:
Forecasting the state of health and remaining useful life of Li-ion batteries is an unsolved challenge that limits technologies such as consumer electronics and electric vehicles. Here we build an accurate battery forecasting system by combining electrochemical impedance spectroscopy (EIS) -- a real-time, non-invasive and information-rich measurement that is hitherto underused in battery diagnosis -- with Gaussian process machine learning. We collect over 20,000 EIS spectra of commercial Li-ion batteries at different states of health, states of charge and temperatures -- the largest dataset to our knowledge of its kind. Our Gaussian process model takes the entire spectrum as input, without further feature engineering, and automatically determines which spectral features predict degradation. Our model accurately predicts the remaining useful life, even without complete knowledge of past operating conditions of the battery. Our results demonstrate the value of EIS signals in battery management systems.
Keywords: battery degradation, machine learning method, electrochemical impedance spectroscopy, battery diagnosis
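The regression machinery described, a Gaussian process mapping spectral inputs to a degradation target, can be sketched in a few lines of NumPy. The RBF kernel, its hyperparameters and the sine toy target below are illustrative assumptions, not the paper's trained model or its EIS dataset:

```python
import numpy as np

def gp_mean(X_train, y_train, X_test, length=1.0, noise=1e-2):
    """Posterior mean of a Gaussian process regressor with an RBF
    kernel, given noisy training targets."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    K = k(X_train, X_train) + noise * np.eye(len(X_train))
    return k(X_test, X_train) @ np.linalg.solve(K, y_train)

# Toy stand-in: 1D inputs and a smooth target instead of EIS spectra
# mapped to capacity or remaining useful life.
rng = np.random.default_rng(2)
X = rng.uniform(-3.0, 3.0, size=(50, 1))
y = np.sin(X[:, 0])
mean = gp_mean(X, y, X)   # posterior mean at the training inputs
```

In the paper's setting, each row of `X` would be an entire impedance spectrum; because the kernel weights every input dimension, the model can itself discover which spectral features predict degradation, without hand-crafted feature engineering.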
Procedia PDF Downloads 147
2174 Biomorphological Characteristics, Habitats, Role in Plant Communities and Raw Reserves of Ajuga turkestanica (Regel) Briq. (Lamiaceae) in Uzbekistan
Authors: Akmal E. Egamberdiev, Alim M. Nigmatullaev, Trobjon Kh. Makhkamov
Abstract:
The results of scientific research on the biomorphological features of Ajuga turkestanica (Regel) Briq., its role in plant communities, its modern distribution areas, and its raw material reserves are presented. Plant ontogeny is divided into 3 periods and 9 growth stages. Information on its seasonal and diurnal flowering and seed productivity is provided. The research determined the participation of the studied species in plant communities, its place, and the structure and floristic composition of those communities; as a result, for the first time, 11 new associations in 7 formations of Ajuga turkestanica are described, and a schematic map of the geolocation of the plant's formations and associations in Uzbekistan is given. The populations of A. turkestanica (within its range) are divided into 3 categories and 21 massifs. Its current biological reserve is 93.5±35.3 tons, its usable reserve is 46.2±13.8 tons, and the reserve that can be harvested in 1 year is 28.4±5.42 tons.
Keywords: ontogeny, seed productivity, seasonal flowering, formation, association, dominant, subdominant, areal, biological reserve, operational reserve, annual reserve, GIS map
Procedia PDF Downloads 94
2173 Natural Preservatives: An Alternative to Chemical Preservatives Used in Foods
Authors: Zerrin Erginkaya, Gözde Konuray
Abstract:
Microbial degradation of foods is defined as a decrease in food safety due to microbial activity. Organic acids, sulfur dioxide, sulfide, nitrate, nitrite, dimethyl dicarbonate, and several preservative gases have been used as chemical preservatives in foods, alongside natural preservatives that are indigenous to foods. Studies have determined that herbal preservatives such as blueberry, dried grape, prune, garlic, mustard, and spices inhibit several microorganisms. Moreover, preservatives of animal origin, such as whey, honey, lysozymes of duck and chicken eggs, and chitosan, have been shown to have an antimicrobial effect. Beyond the antimicrobials indigenous to foods, antimicrobial agents produced by microorganisms can also be used as natural preservatives. The antimicrobial performance of a preservative depends on its antimicrobial spectrum, its chemical and physical features, its concentration and mode of action, the components of the food, the process conditions, and the pH and storage temperature. In this review, studies on antimicrobial components indigenous to foods (such as herbal and animal origin antimicrobial agents), antimicrobial materials synthesized by microorganisms, and their usage as antimicrobial agents to preserve foods are discussed.
Keywords: animal origin preservatives, antimicrobial, chemical preservatives, herbal preservatives
Procedia PDF Downloads 375
2172 The Restoration of the Old District in the Urbanization: The Case Study of Samsen Riverside Community, Dusit District, Bangkok
Authors: Tikhanporn Punluekdej, Saowapa Phaithayawat
Abstract:
The objectives of this research are: 1) to discover the mechanism of the restoration process of the old district, and 2) to study people's participation in the community together with related units. This research utilizes qualitative methods along with the tools used in historical and anthropological studies. The research revealed that the restoration process of the old district started with the needs of the local people in the community, who are considered a younger generation. The leading group of the community played a vital role in the restoration process by initiating the whole idea, followed by help from those who have lived in the area for more than fifty years. The restoration process arose from the genuine desire of the local people, without the intervention of local politics. The core group coordinated with related units, such as academic institutions, to find out the most dominant historical features of the community, including its settlement. The Crown Property Bureau, as the sole owner of the land, joined the restoration in the physical development dimension. The restoration was possible due to the cooperation between local people and related units, under designated plans, budgets, and social activities.
Keywords: restoration, urban area, old district, people participation
Procedia PDF Downloads 411
2171 The Convolution Recurrent Network of Using Residual LSTM to Process the Output of the Downsampling for Monaural Speech Enhancement
Authors: Shibo Wei, Ting Jiang
Abstract:
Convolutional-recurrent neural networks (CRN) have recently achieved much success in the speech enhancement field. The common processing method is to use convolution layers to compress the feature space through multiple downsampling steps and then model the compressed features with an LSTM layer. Finally, the enhanced speech is obtained by deconvolution operations that integrate the global information of the speech sequence. However, the feature space compression process may cause a loss of information, so we propose to model the downsampling result of each step with a residual LSTM layer, then join it with the output of the corresponding deconvolution layer and feed them into the next deconvolution layer; in this way, we aim to better integrate the global information of the speech sequence. The experimental results show that the network model we introduce (RES-CRN) achieves better performance than the original CRN, both without residual connections and with simply stacked LSTM layers, in terms of scale-invariant signal-to-distortion ratio (SI-SNR), speech quality (PESQ), and intelligibility (STOI).
Keywords: convolutional-recurrent neural networks, speech enhancement, residual LSTM, SI-SNR
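The residual LSTM idea — an LSTM whose input is added back to its output, so the layer learns a correction to the encoder features rather than replacing them — can be sketched in NumPy. This is a single-layer toy, not the paper's network, and all dimensions are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W: (4H, D), U: (4H, H), b: (4H,)."""
    z = W @ x + U @ h + b
    H = h.shape[0]
    i = sigmoid(z[:H])          # input gate
    f = sigmoid(z[H:2 * H])     # forget gate
    g = np.tanh(z[2 * H:3 * H]) # candidate cell state
    o = sigmoid(z[3 * H:])      # output gate
    c = f * c + i * g
    return o * np.tanh(c), c

def residual_lstm(xs, W, U, b):
    """Run an LSTM over a sequence of feature vectors and add each
    input back to the corresponding output (residual skip)."""
    H = U.shape[1]
    h, c = np.zeros(H), np.zeros(H)
    out = []
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)
        out.append(h + x)  # residual skip: requires feature dim == H
    return np.stack(out)
```

In the RES-CRN described above, the output of such a layer would be joined with the matching deconvolution output before the next upsampling stage.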
Procedia PDF Downloads 198
2170 Tidal Current Behaviors and Remarkable Bathymetric Change in the South-Western Part of Khor Abdullah, Kuwait
Authors: Ahmed M. Al-Hasem
Abstract:
A study of tidal current behavior and bathymetric changes was undertaken in order to establish an information base for future coastal management. The average tidal current velocity was 0.46 m/s, and the maximum velocity was 1.08 m/s during ebb tide. During spring tides, maximum velocities range from 0.90 m/s to 1.08 m/s, whereas during neap tides they vary from 0.40 m/s to 0.60 m/s. Despite greater current velocities during flood tide, the bathymetric features enhance the dominance of the ebb tide. This can be related to the abundance of fine sediments carried by the ebb current approaching the study area, and the relatively coarser sediment carried by the approaching flood current. Significant bathymetric changes were found for the period from 1985 to 1998, with erosion the dominant process. Approximately 96.5% of depth changes occurred within the depth change classes of -5 m to 5 m. The high erosion within the study area will subsequently result in high accretion, particularly in the north, the location of the proposed Boubyan Port and its navigation channel.
Keywords: bathymetric change, Boubyan island, GIS, Khor Abdullah, tidal current behavior
Procedia PDF Downloads 287
2169 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing
Authors: Arjun Kumar Rath, Titus Dhanasingh
Abstract:
Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing at larger volumes. As components become more integrated, devices are tested for their full functionality using advanced software tools. Benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, such tests are custom built for every product and remain unusable for other variants. A majority of the tests go undocumented, are not updated, and become unusable once the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all of these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, Autotest, KernelCI, etc.) designed for testing embedded system devices, each with several unique good features, but no single tool or framework can satisfy all of the testing needs for embedded systems; hence the need for an extensible framework integrating a multitude of tools. Embedded product testing includes board bring-up testing, manufacturing testing, firmware testing, application testing, and assembly testing. Traditional test methods involve developing test libraries and support components for every new hardware platform, even one that belongs to the same domain with an identical hardware architecture. This approach has drawbacks such as non-reusability, where platform-specific libraries cannot be reused; the need to maintain source infrastructure for individual hardware platforms; and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in test environment setup, scalability, and maintenance.
A desirable strategy is certainly one that is focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle, during all phases of testing and across a family of products. To overcome the challenges of the conventional method and deliver the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, is designed, which can be deployed in embedded system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing different hardware, including microprocessors and microcontrollers. It offers benefits such as: (1) time-to-market: it accelerates board bring-up with prepackaged test suites supporting all necessary peripherals, which speeds up the design and development stages (board bring-up, manufacturing, and device drivers); (2) reusability: framework components isolated from platform-specific hardware initialization and configuration make adapting test cases across various platforms quick and simple; (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) continuous integration: pre-integration with Jenkins enables continuous testing and an automated software update feature. Applying the embedded test framework accelerator throughout the design and development phases enables the development of well-tested systems before functional verification and improves time to market to a large extent.
Keywords: board diagnostics software, embedded system, hardware testing, test frameworks
Procedia PDF Downloads 143
2168 Optimizing Machine Learning Through Python-Based Image Processing Techniques
Authors: Srinidhi. A, Naveed Ahmed, Twinkle Hareendran, Vriksha Prakash
Abstract:
This work reviews some of the advanced image processing techniques used in deep learning applications. Object detection by template matching, image denoising, edge detection, and super-resolution modelling are but a few of the tasks covered. The paper examines them in great detail, given that such tasks are crucial preprocessing steps that increase the quality and usability of image datasets in subsequent deep learning tasks. We review methods for the assessment of image quality, more specifically sharpness, which is crucial to ensure robust model performance. Further, we discuss the development of deep learning models for facial emotion detection, age classification, and gender classification, relating the preprocessing techniques to model performance. Conclusions from this study pinpoint best practices in the preparation of image datasets, targeting the best trade-off between computational efficiency and the retention of image features critical for effective training of deep learning models.
Keywords: image processing, machine learning applications, template matching, emotion detection
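One widely used sharpness measure of the kind the abstract alludes to is the variance of the Laplacian; a minimal NumPy sketch (our choice of metric for illustration, not necessarily the authors'):

```python
import numpy as np

def laplacian_variance(img):
    """Sharpness score: variance of the discrete Laplacian response.

    Blurry images have weak second derivatives, so a low variance of
    the Laplacian is a common proxy for low sharpness.
    img: 2D grayscale array.
    """
    # 5-point Laplacian stencil evaluated on the interior pixels.
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())
```

In a dataset-preparation pipeline, images scoring below a chosen threshold could be flagged or excluded before training.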
Procedia PDF Downloads 12
2167 Reconsidering Taylor’s Law with Chaotic Population Dynamical Systems
Authors: Yuzuru Mitsui, Takashi Ikegami
Abstract:
The exponents of Taylor’s law in deterministic chaotic systems are computed, and their meanings are discussed in depth. Taylor’s law is the scaling relationship between the mean and variance (in both space and time) of population abundance, and this law is known to hold in a variety of ecological time series. The exponents found for the temporal Taylor’s law differ from those of the spatial Taylor’s law. The temporal Taylor’s law is calculated on time series from the same locations (or the same initial states) across different temporal phases, whereas for the spatial Taylor’s law, the mean and variance are calculated at the same temporal phase sampled from different places. Most previous studies used stochastic models, but we computed the temporal and spatial Taylor’s law in deterministic systems: the temporal Taylor’s law was evaluated using the same initial state, and the spatial Taylor’s law was evaluated using the ensemble average and variance. There were two main discoveries. First, it is often stated that deterministic systems tend to have a Taylor’s exponent of two; however, most of the exponents calculated here were not two. Second, we investigated the relationships between Taylor’s exponents and chaotic features measured by the Lyapunov exponent, the correlation dimension, and other indexes. No strong correlations were found; however, there is some relationship within the same model under different parameter values, and we discuss the meaning of these results at the end of this paper.
Keywords: chaos, density effect, population dynamics, Taylor’s law
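The spatial Taylor's-law computation described above — ensemble mean and variance at each temporal phase, then a log-log fit of variance against mean — can be sketched as follows. The logistic map, its parameter r, and the ensemble/step counts are illustrative choices, not the paper's models:

```python
import numpy as np

def spatial_taylor_exponent(r=3.7, n_ensemble=200, n_steps=300):
    """Estimate the spatial Taylor's-law exponent b in
    variance = a * mean**b for an ensemble of logistic maps
    x -> r x (1 - x) started from different initial states."""
    rng = np.random.default_rng(0)
    x = rng.uniform(0.1, 0.9, n_ensemble)  # different initial states
    means, variances = [], []
    for _ in range(n_steps):
        x = r * x * (1.0 - x)
        # Ensemble statistics at this temporal phase.
        means.append(x.mean())
        variances.append(x.var())
    # Least-squares slope of log(variance) against log(mean).
    b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
    return b
```

Because the ensemble distribution of a chaotic map settles toward an invariant density, the fitted slope can be sensitive to the parameter values chosen, which echoes the abstract's finding that the exponent is not simply two.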
Procedia PDF Downloads 173
2166 CompleX-Machine: An Automated Testing Tool Using X-Machine Theory
Authors: E. K. A. Ogunshile
Abstract:
This paper is aimed at creating an automatic Java X-Machine testing tool for software development. The nature of software development is changing; thus, the type of software testing tools required is also changing. Software is growing increasingly complex and, in part due to the commercial impetus for faster software releases with new features and value, is increasingly in danger of containing faults. These faults can incur huge costs for software development organisations and users; Cambridge Judge Business School’s research estimated the cost of software bugs to the global economy at $312 billion. Beyond the cost, faster software development methodologies and increasing expectations on developers to become testers are driving demand for faster, automated, and effective tools to prevent potential faults as early as possible in the software development lifecycle. Using X-Machine theory, this paper explores a new tool to address software complexity, changing expectations on developers, and faster development pressures and methodologies, with a view to reducing the huge cost of fixing software bugs.
Keywords: conformance testing, finite state machine, software testing, x-machine
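An X-machine extends a finite state machine with an internal memory: transitions are labelled with processing functions rather than bare symbols. The paper's tool targets Java; a minimal illustrative sketch of the structure (all names and the example machine are hypothetical) in Python:

```python
class XMachine:
    """Minimal stream X-machine: a finite state machine whose
    transitions carry processing functions that consume an input
    symbol and update an internal memory."""

    def __init__(self, initial_state, memory, transitions):
        # transitions: {(state, symbol): (next_state, function)},
        # where function(memory, symbol) -> (new_memory, output).
        self.state = initial_state
        self.memory = memory
        self.transitions = transitions

    def step(self, symbol):
        self.state, fn = self.transitions[(self.state, symbol)]
        self.memory, out = fn(self.memory, symbol)
        return out

# Example: a two-state machine that counts 'a' symbols while
# toggling between states s0 and s1.
def count(mem, sym):
    return mem + 1, mem + 1

machine = XMachine("s0", 0, {
    ("s0", "a"): ("s1", count),
    ("s1", "a"): ("s0", count),
})
```

Conformance testing, as in the keywords, then amounts to driving an implementation with input sequences derived from such a specification and comparing outputs.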
Procedia PDF Downloads 268
2165 Operating System Support for Mobile Device Thermal Management and Performance Optimization in Augmented Reality Applications
Authors: Yasith Mindula Saipath Wickramasinghe
Abstract:
Augmented reality applications require high processing power to load, render, and live-stream high-definition AR models and virtual scenes; they also require device sensors to work intensively to coordinate with the internal hardware and OS and deliver the expected outcome in advanced features like object detection, real-time tracking, and voice and text recognition. Excessive heat generation due to these advanced functionalities has become a major research problem, as smaller mobile devices cannot sustain such heat increases and battery drainage, which cause physical harm to the devices in the long term. Therefore, effective thermal management is one of the major requirements in augmented reality application development. This paper discusses the major causes of this issue and provides possible solutions in the form of operating system adaptations, as well as further research on best coding practices to optimize application performance and reduce excessive thermal generation.
Keywords: augmented reality, device thermal management, GPU, operating systems, device I/O, overheating
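One adaptation of the kind the paper points toward — scaling AR workload with thermal and battery state — can be sketched as a simple policy function. All thresholds and tier names are hypothetical; they do not come from any real OS API:

```python
def choose_render_quality(temp_c, battery_pct):
    """Pick an AR render-quality tier from device temperature (C)
    and battery level (%). Thresholds are illustrative only."""
    if temp_c >= 45 or battery_pct <= 10:
        return "low"     # aggressive throttling to protect the device
    if temp_c >= 38 or battery_pct <= 30:
        return "medium"  # reduce model detail / frame rate
    return "high"        # full-quality rendering and tracking
```

An AR runtime could poll such a policy each frame and lower model resolution or tracking frequency before the OS is forced into hard thermal throttling.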
Procedia PDF Downloads 116
2164 Predictive Analytics for Theory Building
Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim
Abstract:
Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) for a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed from causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis; however, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big-data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where the items in a basket are fully connected. A cluster is a collection of fully connected items, where a specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently.
For a demonstration, a total of 13,254 metabolic syndrome training records were plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, those associated with sociodemographics, habits, and activities. Some, such as cancer examination, house type, and vaccination, are intentionally included to gain predictive analytics insights into variable selection. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation; a set of rules (many estimated equations from a statistical perspective), on the other hand, may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a subset of observations, such as bootstrap resampling with an appropriate sample size.
Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building
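The co-occurrence graph construction described above — items as nodes, with every pair of items in a basket connected by an arc — can be sketched in a few lines. The item names below are hypothetical illustrations, not the study's actual predictors:

```python
from itertools import combinations
from collections import Counter

def co_occurrence_graph(baskets):
    """Build a co-occurrence graph from transactions: node weights
    count item frequency, arc weights count how often a pair of
    items appears together (items in a basket are fully connected)."""
    nodes, arcs = Counter(), Counter()
    for basket in baskets:
        items = sorted(set(basket))
        nodes.update(items)
        arcs.update(combinations(items, 2))
    return nodes, arcs

# Hypothetical transactions; real rows would hold the 31 predictors.
baskets = [["smoking", "high_bp", "obesity"],
           ["smoking", "high_bp"],
           ["obesity"]]
nodes, arcs = co_occurrence_graph(baskets)
```

The counters give exactly the size and frequency metrics the abstract mentions for ranking clusters, and the graph's size grows with the number of distinct items rather than the number of transactions.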
Procedia PDF Downloads 275