Search results for: export trade data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25190

24140 Surface Sterilization Retain Postharvest Quality and Shelf Life of Strawberry and Cherry Tomato during Modified Atmosphere Packaging

Authors: Ju Young Kim, Mohammad Zahirul Islam, Mahmuda Akter Mele, Su Jeong Han, Hyuk Sung Yoon, In-Lee Choi, Ho-Min Kang

Abstract:

Strawberry and tomato fruits were harvested at the red-ripe maturity stage in the Republic of Korea. The fruits were dipped in a fungal suspension and afterwards sterilized with sodium hypochlorite (NaOCl) or chlorine dioxide (ClO2) gas. Some fruits were dipped in 150 μL/L NaOCl solution for 10 minutes, others were treated with 5 μL/L ClO2 gas for 12 hours and packed in 20,000 cc OTR (oxygen transmission rate) film, and the rest were packed in 10,000 cc OTR film with a 5 μL/L ClO2 gas insert. The 5 μL/L ClO2 gas insert treatment showed the lowest carbon dioxide and ethylene concentrations and the highest oxygen concentration on the final storage day (day 15) for both strawberry and tomato fruits. Tomato fruits showed the lowest fresh weight loss under the 5 μL/L ClO2 gas insert treatment. Visual quality and shelf life were also highest under the 5 μL/L ClO2 gas insert treatment for both fruits, and fungal incidence was most strongly suppressed. The 5 μL/L ClO2 gas insert treatment also maintained higher firmness and soluble solids in both strawberry and tomato fruits. Therefore, the 5 μL/L ClO2 gas insert treatment may be useful for preventing fungal incidence, retaining postharvest quality, and increasing the shelf life of strawberry and tomato fruits during long-term storage. This study was supported by the Export Promotion Technology Development Program (314027-03), IPET, Ministry of Agriculture, Food and Rural Affairs, Republic of Korea.

Keywords: chlorine dioxide, ethylene, fungi, sodium hypochlorite

Procedia PDF Downloads 352
24139 Analyzing Test Data Generation Techniques Using Evolutionary Algorithms

Authors: Arslan Ellahi, Syed Amjad Hussain

Abstract:

Software testing is a vital process in the software development life cycle, and the quality of software can only be assured after it has passed through the testing phase. Automatic test data generation is a key research area in software testing, aimed at achieving test automation and ultimately reducing testing time. In this paper, we review approaches presented in the literature that use evolutionary search-based algorithms, such as the Genetic Algorithm and Particle Swarm Optimization (PSO), to drive the test data generation process. We also examine the quality of the generated test data, which increases or decreases the efficiency of testing. We propose test data generation techniques for model-based testing and work on tuning the parameters and the fitness function of the PSO algorithm.
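
As a minimal sketch of how PSO can drive test data generation, the snippet below searches for inputs that cover a hypothetical branch using a branch-distance style fitness function; the unit under test, its branch predicate, and the PSO coefficients are illustrative assumptions, not the paper's exact setup.

```python
import random

def fitness(x):
    """Branch-distance fitness for a hypothetical unit under test: we want inputs
    reaching the branch `if x[0] > 100 and x[1] == x[0] - 7`. Lower is better;
    0 means the branch is covered by the generated test input."""
    d1 = max(0.0, 100.0 - x[0])            # distance to satisfying x[0] > 100
    d2 = abs(x[1] - (x[0] - 7.0))          # distance to satisfying the equality
    return d1 + d2

DIM, SWARM, ITERS = 2, 20, 100
W, C1, C2 = 0.7, 1.5, 1.5                  # inertia and acceleration coefficients (the tuning targets)

pos = [[random.uniform(-500, 500) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=fitness)

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if fitness(pos[i]) < fitness(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=fitness)

print("generated test input:", gbest, "fitness:", fitness(gbest))
```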

Keywords: search based, evolutionary algorithm, particle swarm optimization, genetic algorithm, test data generation

Procedia PDF Downloads 172
24138 Comparative Analysis of the Third Generation of Reanalysis Data for Evaluation of Solar Energy Potential

Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag

Abstract:

Renewable energy sources depend on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances in meteorological measurement systems over the last decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data assimilation system built on general atmospheric circulation models that combines data collected at surface stations, ocean buoys, satellites, and radiosondes, allowing the production of long-period data for a wide range of variables. The third generation of reanalysis data emerged in 2010; among them is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), with a spatial resolution of 0.5° x 0.5°. To overcome the lack of observations, this study evaluates the performance of solar radiation estimates from alternative databases, such as reanalysis and meteorological satellite data, that can satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The analysis of the solar radiation data indicated that the CFSR reanalysis data performed well against the observed data, with a coefficient of determination of around 0.90. It is therefore concluded that these data have the potential to be used as an alternative source at locations with no, or only short, series of solar radiation observations, which is important for the evaluation of solar energy potential.
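
A minimal sketch of the validation step is shown below: the coefficient of determination (plus common error statistics) between hypothetical observed and CFSR-derived daily solar radiation series. All values are invented for illustration.

```python
import numpy as np

# Hypothetical daily global solar radiation series (MJ/m2/day):
# station observations vs. the CFSR reanalysis grid cell containing the station.
observed = np.array([18.2, 21.5, 16.8, 24.0, 19.4, 22.7, 15.9, 23.1])
cfsr     = np.array([17.5, 22.0, 17.9, 23.2, 18.8, 21.9, 16.8, 22.4])

# Coefficient of determination of the linear relation between reanalysis and observation.
r = np.corrcoef(observed, cfsr)[0, 1]
r2 = r ** 2

# Complementary error statistics commonly reported in such validations.
bias = np.mean(cfsr - observed)
rmse = np.sqrt(np.mean((cfsr - observed) ** 2))
print(f"R2={r2:.2f}  bias={bias:.2f}  RMSE={rmse:.2f} MJ/m2/day")
```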

Keywords: climate, reanalysis, renewable energy, solar radiation

Procedia PDF Downloads 194
24137 Spatial Data Mining: Unsupervised Classification of Geographic Data

Authors: Chahrazed Zouaoui

Abstract:

In recent years, the volume of geospatial information has been increasing due to the evolution of information and communication technologies; this information is often handled by geographic information systems (GIS) and stored in spatial databases. Classical data mining has shown weaknesses in extracting knowledge from such enormous amounts of data because of the particularity of spatial entities, which are characterized by interdependence between them (the first law of geography). This gave rise to spatial data mining. Spatial data mining is the process of analyzing geographic data, allowing the extraction of knowledge and spatial relationships from geospatial data; its methods can be divided into monothematic and multithematic ones. Geo-clustering, one of the main tasks of spatial data mining, belongs to the monothematic methods. It groups similar geospatial entities into the same class and assigns more dissimilar entities to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity while taking into account the particularity of geospatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which applies algorithms designed to treat spatial data directly, and an approach based on pre-processing the spatial data, which applies classical clustering algorithms to pre-processed data (with spatial relationships integrated). The pre-processing approach is quite complex in various cases, so the search for approximate solutions involves the use of approximation algorithms; among these, we are interested in dedicated approaches (partitioning-based and density-based clustering methods) and bee-inspired approaches (a biomimetic approach). Our study proposes a design for this problem that uses different algorithms to automatically detect geospatial neighborhoods in order to implement geo-clustering by pre-treatment, and applies the bees algorithm to this problem for the first time in the geospatial field.
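
A minimal sketch of geo-clustering by pre-processing is given below: each entity's spatial neighborhood is detected (here simply via k nearest neighbours), neighbourhood-averaged attributes are appended to the thematic attributes, and a classical clustering algorithm is applied. The data, the neighbourhood rule, and the use of k-means rather than the bees algorithm are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(300, 2))      # hypothetical entity locations (x, y)
attrs  = rng.normal(size=(300, 3))              # hypothetical thematic attributes

# Pre-processing: detect each entity's spatial neighborhood (its k nearest
# neighbours) and average the neighbours' attributes.
k = 8
nn = NearestNeighbors(n_neighbors=k + 1).fit(coords)
_, idx = nn.kneighbors(coords)
neighbour_mean = attrs[idx[:, 1:]].mean(axis=1)  # drop the point itself (column 0)

# Classical clustering applied to the enriched features, so that spatial
# autocorrelation (the first law of geography) is injected into the partition.
features = np.hstack([attrs, neighbour_mean])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))
```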

Keywords: mining, GIS, geo-clustering, neighborhood

Procedia PDF Downloads 364
24136 Financial Development and Economic Growth of Sub-Saharan Africa Using System GMM Analysis

Authors: Temesgen Yaekob Ergano, Sure Pulla Rao

Abstract:

This study utilizes system GMM analysis to investigate the relationship between financial development indicators and economic performance in Sub-Saharan Africa. The findings reveal significant impacts of various financial indicators on economic growth, such as the positive influence of the bank liquid reserves to bank assets ratio (R/A), trade openness, and the broad money to total reserves ratio (M/R). Additionally, the study highlights the negative impact of domestic credit provided to the private sector by banks (D_bank) on economic growth, emphasizing the importance of prudent credit allocation to avoid over-indebtedness and financial crises. These results provide valuable insights for policymakers aiming to foster sustainable economic growth in the region by leveraging financial development effectively.
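
As an illustration of the estimation machinery, the following is a minimal sketch of a generic two-step linear GMM estimator on synthetic data. It is not the authors' Blundell-Bond system GMM specification (which stacks differenced and level equations and uses lagged variables as instruments); the variable names and data are purely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, m = 200, 3, 5                      # observations, regressors, instruments

# Hypothetical data: y = growth, X = regressors (e.g. R/A, M/R, trade openness),
# Z = instrument matrix (lagged levels/differences in a real system GMM setup).
Z = rng.normal(size=(n, m))
X = Z[:, :k] + 0.3 * rng.normal(size=(n, k))
beta_true = np.array([0.5, -0.2, 0.8])
y = X @ beta_true + rng.normal(size=n)

def gmm_step(W):
    # Minimize the quadratic form g(beta)' W g(beta) with g(beta) = Z'(y - X beta).
    A = X.T @ Z @ W @ Z.T @ X
    b = X.T @ Z @ W @ Z.T @ y
    return np.linalg.solve(A, b)

# Step 1: initial weight matrix (Z'Z)^-1 (2SLS-equivalent estimate).
W1 = np.linalg.inv(Z.T @ Z)
beta1 = gmm_step(W1)

# Step 2: efficient weight matrix built from the step-1 residuals.
u = y - X @ beta1
S = (Z * (u ** 2)[:, None]).T @ Z / n
beta2 = gmm_step(np.linalg.inv(S))
print("two-step GMM estimates:", beta2)
```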

Keywords: financial development, economic growth, Sub-Saharan Africa, system GMM analysis, financial indicators

Procedia PDF Downloads 33
24135 Analysis and Prediction of Netflix Viewing History Using Netflixlatte as an Enriched Real Data Pool

Authors: Amir Mabhout, Toktam Ghafarian, Amirhossein Farzin, Zahra Makki, Sajjad Alizadeh, Amirhossein Ghavi

Abstract:

The high number of Netflix subscribers makes it attractive for data scientists to extract valuable knowledge from viewers' behavioural analyses. This paper presents a set of statistical insights into viewers' viewing history. A deep learning model is then used to predict the future watching behaviour of the users based on their previous watching history within the Netflixlatte data pool. Netflixlatte is an aggregated and anonymized data pool of 320 Netflix viewers comprising 250,000 data points recorded between 2008 and 2022. We observe insightful correlations between the distribution of viewing time and the COVID-19 pandemic outbreak. The presented deep learning model predicts future movie and TV series viewing habits with an average loss of 0.175.
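
A minimal sketch of a deep learning model for the prediction step is shown below: an LSTM trained on sliding windows of a hypothetical per-day viewing-minutes series. The features, window length, and network size are assumptions, not the Netflixlatte pipeline itself.

```python
import numpy as np
import tensorflow as tf

# Hypothetical per-day viewing minutes for one viewer (the study uses the much
# richer Netflixlatte pool of 320 viewers).
rng = np.random.default_rng(0)
minutes = np.abs(rng.normal(60, 30, size=1000)).astype("float32")

# Sliding windows: 30 previous days -> next day's viewing minutes.
WINDOW = 30
X = np.stack([minutes[i:i + WINDOW] for i in range(len(minutes) - WINDOW)])[..., None]
y = minutes[WINDOW:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.1, verbose=0)

print("predicted next-day minutes:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```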

Keywords: data analysis, deep learning, LSTM neural network, netflix

Procedia PDF Downloads 217
24134 Analysis of User Data Usage Trends on Cellular and Wi-Fi Networks

Authors: Jayesh M. Patel, Bharat P. Modi

Abstract:

Measurements of data usage on mobile devices have demonstrated that the total data demand from users is far higher than previously articulated by measurements based solely on a cellular-centric view of smartphone usage, and the ratio of Wi-Fi to cellular traffic varies significantly between countries. This paper presents a comparison of cellular data usage and Wi-Fi data usage by the user. Such an analysis helps operators understand the growing importance and application of yield management strategies designed to squeeze maximum returns from their investments in the networks and devices that enable the mobile data ecosystem. The transition from unlimited data plans towards tiered pricing and, in the future, towards more value-centric pricing offers significant revenue upside potential for mobile operators; however, without complete insight into all aspects of smartphone customer behavior, operators are unlikely to capture the maximum return from this billion-dollar market opportunity.
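
As a minimal illustration of the comparison described, the sketch below computes a per-country Wi-Fi to cellular traffic ratio from a hypothetical usage table; the figures and column names are invented for illustration only.

```python
import pandas as pd

# Hypothetical monthly per-user traffic volumes (GB) by country and access network.
usage = pd.DataFrame({
    "country": ["A", "A", "B", "B", "C", "C"],
    "network": ["wifi", "cellular"] * 3,
    "gb_per_user": [9.5, 2.1, 6.0, 3.8, 12.4, 1.7],
})

pivot = usage.pivot(index="country", columns="network", values="gb_per_user")
pivot["wifi_to_cellular"] = pivot["wifi"] / pivot["cellular"]
print(pivot)
```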

Keywords: cellular, Wi-Fi, mobile, smart phone

Procedia PDF Downloads 348
24133 Data Driven Infrastructure Planning for Offshore Wind Farms

Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree

Abstract:

The calculations done at the beginning of the life of a wind farm are rarely reliable, which makes it important to study the failure and repair rates of the wind turbines under various conditions. This miscalculation happens because current models make the simplifying assumption that the failure/repair rate remains constant over time, which means the reliability function is exponential in nature. This research aims to create a more accurate model using sensor data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data. These are then converted into times-to-repair and times-to-failure time series. Several mathematical functions are fitted to the times-to-failure and times-to-repair data of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease the downtime of the turbine.
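
As a minimal sketch of the curve-fitting step, the snippet below fits a two-parameter Weibull and an exponential distribution to hypothetical times-to-failure by maximum likelihood using SciPy; the data values are invented, and the Bayesian posterior-expectation step of the paper is not shown.

```python
import numpy as np
from scipy import stats

# Hypothetical times-to-failure (hours) extracted from SCADA-derived event data.
ttf = np.array([1200., 340., 2210., 980., 1550., 410., 3010., 760., 1880., 2600.])

# Two-parameter Weibull: fix the location at zero so only shape and scale are estimated.
shape, loc, scale = stats.weibull_min.fit(ttf, floc=0)

# Exponential: fix the location at zero so only the scale (mean time to failure) is estimated.
exp_loc, exp_scale = stats.expon.fit(ttf, floc=0)

# Compare the two fits by log-likelihood.
ll_weibull = np.sum(stats.weibull_min.logpdf(ttf, shape, loc, scale))
ll_exp = np.sum(stats.expon.logpdf(ttf, exp_loc, exp_scale))
print(f"Weibull shape={shape:.2f}, scale={scale:.0f} h, logL={ll_weibull:.1f}")
print(f"Exponential MTTF={exp_scale:.0f} h, logL={ll_exp:.1f}")
```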

Keywords: reliability, bayesian parameter inference, maximum likelihood estimation, weibull function, SCADA data

Procedia PDF Downloads 63
24132 Application on Metastable Measurement with Wide Range High Resolution VDL Circuit

Authors: Po-Hui Yang, Jing-Min Chen, Po-Yu Kuo, Chia-Chun Wu

Abstract:

This paper proposes a high resolution Vernier Delay Line (VDL) measurement circuit with a coarse and fine detection mechanism, which improves the trade-off between high resolution and the number of delay cells in traditional VDL circuits, while the measuring time of the proposed circuit also satisfies the high resolution requirements. First, the range of the input signal is detected by the coarse-detection VDL. Then, the delayed input signal is transmitted to the fine-detection VDL to be measured with better accuracy. The circuit is implemented in a 0.18 μm process, the operating frequency is 100 MHz, and the resolution reaches 2.0 ps with only 16 stages of delay cells. The measurement range is 170 ps wide, and 17% of the stages are saved compared with a traditional single delay line circuit.

Keywords: vernier delay line, D-type flip-flop, DFF, metastable phenomenon

Procedia PDF Downloads 583
24131 Empirical Acceleration Functions and Fuzzy Information

Authors: Muhammad Shafiq

Abstract:

In accelerated life testing, lifetime data are obtained under conditions considered more severe than the usual condition. Classical techniques are based on precise measurements and model only the variation among the observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that do not consider fuzziness and are based only on precise lifetime observations lead to pseudo results. This study examines the behavior of empirical acceleration functions using fuzzy lifetime data. The results show increased fuzziness in the transformed lifetimes compared with the input data.

Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data

Procedia PDF Downloads 281
24130 Protection towards Investor: Enforcement of the Authorities of Indonesian Financial Services Authority (OJK) during Capital Market Integration

Authors: Muhammad Ilham Agus Salim, Muhammad Ikbal

Abstract:

The ASEAN Economic Community (AEC) was set up in 2003 with the objectives of creating a single market and production base, enhancing equitable economic development, and facilitating integration into the global economy. The AEC involves liberalization and facilitation of trade in goods, skilled labour, services, and investment, as well as protection and promotion of investment. This thesis outlines the AEC Blueprint actions within the scope of globalization of investment and capital markets. Free flows of investment and freer flows of capital urge countries in South East Asia to coordinate and collaborate in securing the public interest, and this underlines the importance of financial services authorities in ASEAN preparing mechanisms for guarding the flows of investment. There is no exception, especially for the Indonesian Financial Services Authority (OJK), as one of the authorized bodies for capital market supervision, to enforce its authority as a supervisory body.

Keywords: AEC blueprint, OJK, capital market, integration

Procedia PDF Downloads 298
24129 Warfare Ships in Ancient Egypt: From the Pre-Historic Era (3700 B.C.) until the End of the 2nd Intermediate Period (1550 B.C.)

Authors: Mohsen Negmeddin

Abstract:

Throughout their history, the ancient Egyptians knew several kinds and types of boats, made from two main kinds of materials: local ones, such as dried papyrus reeds and local tree trunks, and imported ones, such as boats built from Lebanese cedar trunks. These boats had varied uses: small fishing boats, transportation and trade ("cargo") boats, ceremonial boats, and warfare boats. This research focuses on the last of these, the warfare boats and the river and maritime battles, from the beginning of ancient Egyptian civilization in the pre-historic era until the end of the Second Intermediate Period, to reveal the kinds and types of fighting ships in use before the Egyptian navy was established at the beginning of the New Kingdom (1550-1070 B.C.). Two methods are followed in this research: tracing the names and titles of these ships through textual sources in the ancient Egyptian language, and examining their depiction in scenes.

Keywords: the warfare boats, the maritime battles, the pre-historic era, the second intermediate period

Procedia PDF Downloads 256
24128 The Applications of Toyota Production System to Reduce Wastes in Agricultural Products Packing Process: A Study of Onion Packing Plant

Authors: P. Larpsomboonchai

Abstract:

Agro-industry is one of the major industries with a strong impact on national economic income, growth, stability, and sustainable development. Moreover, this industry also has strong influences on social, cultural, and political issues. As a producer of primary and secondary products, it faces challenges from diverse factors such as demand inconsistency, intense international competition, technological advancements, and new competitors. In order to maintain and improve the industry's competitiveness in both domestic and international markets, science and technology are key factors. Besides hard sciences and technologies, modern industrial engineering concepts such as Just in Time (JIT), Total Quality Management (TQM), Quick Response (QR), Supply Chain Management (SCM), and Lean can be very effective in increasing the efficiency and effectiveness of these agricultural products on the world stage. Onion is one of Thailand's major export products and brings in national income, but it also faces challenges in many ways. This paper focuses on the onion packing process and its related activities, such as storage and shipment, at one of the major packing plants and storage facilities in Mae Wang District, Chiang Mai, Thailand, applying Toyota Production System (TPS), or Lean, concepts to improve process capability throughout the entire packing and distribution process, which will be profitable for the whole onion supply chain and beneficial to other related agricultural products in Thailand and other ASEAN countries.

Keywords: packing process, Toyota Production System (TPS), lean concepts, waste reduction, lean in agro-industries activities

Procedia PDF Downloads 260
24127 Evaluating Alternative Structures for Prefix Trees

Authors: Feras Hanandeh, Izzat Alsmadi, Muhammad M. Kwafha

Abstract:

Prefix trees, or tries, are data structures used to store data or indexes of data. The goal is to be able to store and retrieve data by executing queries quickly and reliably. In principle, the structure of the trie depends on having letters in nodes at the different levels that point to the actual words in the leaves. However, the exact structure of the trie may vary based on several aspects. In this paper, we evaluated different structures for building tries. Using datasets of words of different sizes, we evaluated the different forms of trie structures. Results showed that some characteristics may significantly impact, positively or negatively, the size and the performance of the trie. We investigated different forms and structures for the trie, and results showed that using an array of pointers in each level to represent the different alphabet letters is the best choice.
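
A minimal sketch of the array-of-pointers trie structure that the study found to perform best is shown below; the implementation details (lowercase ASCII alphabet, 26-slot child arrays) are assumptions made for illustration.

```python
class TrieNode:
    __slots__ = ("children", "is_word")
    def __init__(self):
        # One slot per lowercase letter; index = ord(ch) - ord('a').
        self.children = [None] * 26
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            i = ord(ch) - ord('a')
            if node.children[i] is None:
                node.children[i] = TrieNode()
            node = node.children[i]
        node.is_word = True

    def contains(self, word):
        node = self.root
        for ch in word:
            node = node.children[ord(ch) - ord('a')]
            if node is None:
                return False
        return node.is_word

t = Trie()
for w in ["data", "date", "index"]:
    t.insert(w)
print(t.contains("date"), t.contains("dat"))   # True False
```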

Keywords: data structures, indexing, tree structure, trie, information retrieval

Procedia PDF Downloads 442
24126 Predicting the Relationship Between Childhood Trauma and the Formation of Defense Mechanisms with the Mediating Role of Object Relations in Traders

Authors: Ahmadreza Jabalameli, Mohammad Ebrahimpour Borujeni

Abstract:

According to psychodynamic theories, the major personality structure of individuals is formed in the first years of life. Trauma is an inseparable and undeniable part of everyone's life, and people inevitably struggle with many traumas that can have a very significant impact on their lives. The present study deals with the relationship between childhood trauma and the formation of defense mechanisms and the role of object relations. This descriptive study is a correlational study using structural equation modeling (SEM). Sampling was by convenience and consists of 200 experienced traders at Jabalameli Information Technology Company. The results indicate that the experience of childhood trauma, with a demographic moderating effect, can lead, through the mediating role of object relations, to vulnerability in ego reality functioning and to immature and psychically disturbed defense mechanisms. In this regard, there is a significant negative relationship between childhood trauma and object relations with mature defense mechanisms.

Keywords: childhood trauma, defense mechanisms, object relations, trade

Procedia PDF Downloads 115
24125 An Efficient Approach to Speed up Non-Negative Matrix Factorization for High Dimensional Data

Authors: Bharat Singh, Om Prakash Vyas

Abstract:

Nowadays, applications dealing with high-dimensional data are tremendously common in popular areas. To handle such data, various approaches have been developed by researchers in the last few decades. One of the problems with NMF approaches is that random initialization may not provide absolute (global) optimization within a limited number of iterations, only local optimization. Due to this, we have proposed a new approach that chooses the initial values of the decomposition in order to tackle the issue of computational expense. We have devised an algorithm for initializing the values of the decomposed matrices based on PSO (Particle Swarm Optimization). Through the experimental results, we show that the proposed method converges very fast in comparison with other low-rank approximation techniques such as simple multiplicative NMF and ACLS.
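
The following is a minimal sketch of NMF with multiplicative updates in which the initialization step is made explicit; in the proposed approach that step would be produced by a PSO search, whereas here a random non-negative initialization is used as a stand-in, and the matrix sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((100, 40))          # hypothetical non-negative high-dimensional data matrix
k = 10                             # target rank
eps = 1e-9

# Initialization: in the paper this step would come from a PSO search;
# random non-negative matrices are used here as a stand-in.
W = rng.random((V.shape[0], k))
H = rng.random((k, V.shape[1]))

# Lee-Seung multiplicative updates minimizing the Frobenius norm ||V - WH||.
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

rmse = np.sqrt(np.mean((V - W @ H) ** 2))
print(f"RMSE of the rank-{k} approximation: {rmse:.4f}")
```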

Keywords: ALS, NMF, high dimensional data, RMSE

Procedia PDF Downloads 328
24124 Integrating Time-Series and High-Spatial Remote Sensing Data Based on Multilevel Decision Fusion

Authors: Xudong Guan, Ainong Li, Gaohuan Liu, Chong Huang, Wei Zhao

Abstract:

Due to the low spatial resolution of MODIS data, the accuracy of extracting small patches in landscapes with a high degree of fragmentation is greatly limited. To this end, the study combines Landsat data, with its higher spatial resolution, and MODIS data, with its higher temporal resolution, for decision-level fusion. Considering the importance of the land heterogeneity factor in the fusion process, it is incorporated as a weighting factor used to linearly weight the Landsat classification result and the MODIS classification result. Three levels were used to complete the data fusion process: the MODIS pixel level, the Landsat pixel level, and an object level that connects the two. The multilevel decision fusion scheme was tested at two sites in the lower Mekong basin. A comparison test proved that the classification accuracy was improved, in terms of overall accuracy, compared with the single-data-source classification results. The method was also compared with two-level combination results and a weighted-sum decision-rule-based approach. The decision fusion scheme is extensible to other multi-resolution data decision fusion applications.
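
A minimal sketch of the linear weighting idea is given below: per-pixel class probabilities from the two classifiers are combined using a land-heterogeneity factor as the weight. The arrays, class counts, and weighting rule details are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

# Hypothetical per-pixel class probabilities from the two classifiers,
# both resampled to a common grid: shape (n_pixels, n_classes).
p_landsat = np.array([[0.7, 0.2, 0.1],
                      [0.3, 0.4, 0.3]])
p_modis   = np.array([[0.5, 0.3, 0.2],
                      [0.2, 0.6, 0.2]])

# Hypothetical land-heterogeneity factor in [0, 1] per pixel: the more
# fragmented the landscape, the more weight goes to the finer Landsat result.
heterogeneity = np.array([0.8, 0.2])

w_landsat = heterogeneity[:, None]
fused = w_landsat * p_landsat + (1.0 - w_landsat) * p_modis
labels = fused.argmax(axis=1)
print(labels)
```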

Keywords: image classification, decision fusion, multi-temporal, remote sensing

Procedia PDF Downloads 107
24123 Optimal Analysis of Structures by Large Wing Panel Using FEM

Authors: Byeong-Sam Kim, Kyeongwoo Park

Abstract:

In this study, induced structural optimization is performed to compare the trade-off between wing weight and induced drag for wing panel extensions, wing panel construction, and winglets. The aerostructural optimization problem consists of parameters with a strength condition and two maneuver conditions using residual stresses from panel production. The kinematic motion analysis yields a homogenization-based theory for 3D beams and 3D shells for the wing panel. This theory uses a kinematic description of the beam based on normalized displacement moments. The displacement of the wing is a significant design consideration, as large deflections lead to large stresses, and the increased fatigue of components causes residual stresses. The stresses in the wing panel are small compared with the yield stress of the aluminum alloy. This study describes the implementation of a large wing panel together with an aerostructural analysis and structural parameter optimization framework coupled with a three-dimensional panel method.

Keywords: wing panel, aerostructural optimization, FEM, structural analysis

Procedia PDF Downloads 570
24122 Analysis of Cooperative Learning Behavior Based on the Data of Students' Movement

Authors: Wang Lin, Li Zhiqiang

Abstract:

The purpose of this paper is to analyze cooperative learning behavior patterns based on data of students' movement. The study first reviews cooperative learning theory and its research status and briefly introduces the k-means clustering algorithm. Then, it uses the clustering algorithm and mathematical statistics to analyze the activity rhythm of individual students and groups in different functional areas, according to movement data provided by 10 first-year graduate students. It also focuses on the analysis of students' behavior in the learning area and explores the regularities of cooperative learning behavior. The results show that the cooperative learning behavior analysis method based on movement data proposed in this paper is feasible. From the results of the data analysis, the behavioral characteristics of students and their cooperative learning behavior patterns could be identified.
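
A minimal sketch of clustering movement records with k-means is shown below; the coordinate/hour features, cluster count, and functional-area interpretations are hypothetical stand-ins for the study's actual data.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical movement records: (x, y) indoor position in metres plus hour of day,
# pooled over the 10 graduate students.
rng = np.random.default_rng(0)
points = np.vstack([
    rng.normal([2, 3, 10], [0.5, 0.5, 1.0], size=(80, 3)),    # e.g. learning area, morning
    rng.normal([8, 7, 14], [0.7, 0.7, 1.5], size=(80, 3)),    # e.g. discussion area, afternoon
    rng.normal([5, 1, 19], [0.4, 0.4, 1.0], size=(80, 3)),    # e.g. rest area, evening
])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(points)
for c, centre in enumerate(km.cluster_centers_):
    n = int((km.labels_ == c).sum())
    print(f"cluster {c}: centre (x={centre[0]:.1f}, y={centre[1]:.1f}, hour={centre[2]:.1f}), {n} records")
```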

Keywords: behavior pattern, cooperative learning, data analyze, k-means clustering algorithm

Procedia PDF Downloads 167
24121 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
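
As a minimal sketch of the dimensionality-reduction half of the approach, the snippet below computes a diffusion-map embedding from a Gaussian affinity kernel using the random-walk normalization (closely related to the graph Laplacian eigenvectors mentioned above); the data, kernel bandwidth, and embedding size are illustrative assumptions, and the DPM generation step is not shown.

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, n_components=2, t=1):
    # Pairwise squared distances and Gaussian affinity kernel.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / epsilon)
    # Row-normalize into a Markov transition matrix P (random-walk normalization).
    P = K / K.sum(axis=1, keepdims=True)
    # Eigendecomposition of P, eigenvalues sorted in decreasing order.
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector; scale coordinates by eigenvalue^t.
    return vecs[:, 1:n_components + 1] * (vals[1:n_components + 1] ** t)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))       # hypothetical high-dimensional samples
embedding = diffusion_map(X, epsilon=5.0)
print(embedding.shape)               # (200, 2) low-dimensional coordinates
```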

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 86
24120 A Security Cloud Storage Scheme Based Accountable Key-Policy Attribute-Based Encryption without Key Escrow

Authors: Ming Lun Wang, Yan Wang, Ning Ruo Sun

Abstract:

With the development of cloud computing, more and more users are utilizing cloud storage services. However, several issues exist: 1) the cloud server steals the shared data; 2) sharers collude with the cloud server to steal the shared data; 3) the cloud server tampers with the shared data; 4) sharers and the key generation center (KGC) conspire to steal the shared data. In this paper, we use the Advanced Encryption Standard (AES), hash algorithms, and accountable key-policy attribute-based encryption without key escrow (WOKE-AKP-ABE) to build a secure cloud storage scheme. The data are encrypted to protect privacy, and hash algorithms are used to prevent the cloud server from tampering with the data uploaded to the cloud. Analysis results show that this scheme can resist collusion attacks.
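
A minimal sketch of the encrypt-then-hash idea (AES-GCM for confidentiality plus a SHA-256 digest for tamper detection) is shown below using the Python cryptography library; it omits the WOKE-AKP-ABE key management that is the core of the scheme, and the key handling shown is purely illustrative.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

data = b"shared document contents"

# Encrypt with AES-256-GCM before upload so the cloud server only sees ciphertext.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, data, None)

# Keep a SHA-256 digest of the ciphertext locally; on download, recompute and
# compare it to detect tampering by the storage provider.
digest = hashlib.sha256(ciphertext).hexdigest()

downloaded = ciphertext                      # stand-in for the blob fetched back
assert hashlib.sha256(downloaded).hexdigest() == digest, "ciphertext was tampered with"
plaintext = AESGCM(key).decrypt(nonce, downloaded, None)
assert plaintext == data
```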

Keywords: cloud storage security, sharing storage, attributes, Hash algorithm

Procedia PDF Downloads 371
24119 Report of the Sea Cucumber Stichopus hermanni from Umm Al-Maradim and Qaruh Islands in Kuwait

Authors: M. Al-Roumi, A. Al-Yaqout, A. Al-Baz

Abstract:

Recently, sea cucumbers have been shown to be significant to global trade and incomes due to their high commercial value for the pharmaceutical and cosmetics industries. This rising demand for sea cucumber products has created increasing harvest pressure on natural populations and led to the depletion of sea cucumber stocks worldwide; accordingly, there is great concern about the health of the marine environment. Few species have been reported and identified, via morphological features only. Several sea cucumber specimens were collected from the north-west side reefs of Qaruh Island and the north side of Umm Al-Maradim Island in Kuwaiti waters, in the north-western Arabian Gulf, in order to identify the sea cucumber species present in Kuwaiti waters. The identified species were Holothuria atra, Holothuria arenicola, Holothuria hilla, and Holothuria impatiens. Species identification was made using morphological keys and a review of their ossicles. This paper reports the species Stichopus hermanni from Kuwait.

Keywords: Stichopus hermanni, Kuwait waters, Arabian Gulf, ossicles

Procedia PDF Downloads 174
24118 Study on Technological Development for Reducing the Sulfur Dioxide Residue Problem in Fresh Longan for Exporting

Authors: Wittaya Apai, Satippong Rattanakam, Suttinee Likhittragulrung, Nuttanai Tungmunkongvorakul, Sompetch Jaroensuk

Abstract:

The objective of this study was to find alternative ways to decrease sulfur dioxide (SO₂) residues and prolong the storage life of fresh longan for export. The Office of Agricultural Research and Development Region 1, Chiang Mai province, conducted the research and development from 2016-2018. Grade A longan cv. Daw fruit with panicles attached was placed in 11.5 kg commercial perforated plastic baskets. Five treatments were selected, with 3 baskets as replications for each treatment: 1.5% SO₂ fumigation prior to inserting SO₂-generating pads (Uvasys®) (1.5% SO₂+SO₂ pad); dipping in 5% hydrochloric acid (HCl) mixed with 1% sodium metabisulfite (SMS) for 5 min (5% HCl+1% SMS); ozone (O₃) fumigation for 1 hour (h) prior to 1.5% SO₂ fumigation (O₃ 1 h+1.5% SO₂); 1.5% SO₂ fumigation prior to O₃ fumigation for 1 h (1.5% SO₂+O₃ 1 h); and 1.5% SO₂ fumigation alone as the commercial treatment (1.5% SO₂). The fruit was stored at 5 °C and 90% relative humidity (RH) for 40-80 days. The results showed that the most promising treatments were 1.5% SO₂+O₃ 1 h and 5% HCl+1% SMS, which prevented pericarp browning for 80 days at 5 °C. There were no significant differences between the 1.5% SO₂+O₃ 1 h and 1.5% SO₂ treatments in several parameters during storage, i.e., pericarp browning, flesh discoloration, disease incidence (%), and sensory evaluation. The 1.5% SO₂+O₃ 1 h application tended to give lower SO₂ residues in the fruit and lower disease incidence (%), as well as brighter pericarp color, compared with the commercial 1.5% SO₂ treatment alone. Moreover, 5% HCl+1% SMS gave the lowest SO₂ residue in the whole fruit, below the Codex tolerance of 50 mg/kg, throughout the storage period. The fruit treated with 1.5% SO₂+O₃ 1 h, 1.5% SO₂, 5% HCl+1% SMS, O₃ 1 h+1.5% SO₂, and 1.5% SO₂+SO₂ pad could be stored for 40, 40, 40, 30, and 30 days, respectively, at 5 °C and 90% RH. Thus, the 1.5% SO₂+O₃ 1 h and/or 5% HCl+1% SMS applications could be used to extend the shelf life of fresh longan exported to restrictive countries, owing to lower SO₂ residues while fruit quality is maintained, compared with the conventional method.

Keywords: longan, sulfur dioxide, ozone fumigation, sodium metabisulfite

Procedia PDF Downloads 108
24117 In-door Localization Algorithm and Appropriate Implementation Using Wireless Sensor Networks

Authors: Adeniran K. Ademuwagun, Alastair Allen

Abstract:

The dependence between RSS and distance in an enclosed environment is an important consideration because it is a factor that can influence the reliability of any localization algorithm founded on RSS. Several algorithms effectively reduce the variance of RSS to improve localization accuracy. Our proposed algorithm essentially avoids this pitfall and is consequently highly adaptable in the face of erratic radio signals. Using 3 anchors in close proximity to each other, we are able to establish that RSS can be used as a reliable indicator for localization with an acceptable degree of accuracy. Inherent in this concept is the ability of each prospective anchor to validate (guarantee) the position or proximity of the other 2 anchors involved in the localization, and vice versa. This procedure ensures that the uncertainties of radio signals due to multipath effects in enclosed environments are minimized. A major driver of this idea is the implicit topological relationship among sensors arising from the raw radio signal strength. The algorithm is an area-based algorithm; however, it does not trade accuracy for precision (i.e., the size of the returned area).
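
A minimal sketch of RSS-based position estimation with a weighted centroid over three anchors is given below; the path-loss constants, anchor layout, and RSS values are assumptions for illustration, and the anchor cross-validation step described above is not shown.

```python
import numpy as np

# Hypothetical anchor coordinates (metres) and RSS readings (dBm) from the mobile node.
anchors = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])
rss_dbm = np.array([-52.0, -61.0, -58.0])

# Log-distance path-loss model: rss = rss_at_1m - 10 * n * log10(d),
# so the estimated distance is d = 10 ** ((rss_at_1m - rss) / (10 * n)).
rss_at_1m, n = -40.0, 2.5          # assumed calibration constants
dist = 10 ** ((rss_at_1m - rss_dbm) / (10 * n))

# Weighted centroid: anchors that appear closer (stronger RSS) get larger weights.
weights = 1.0 / dist
estimate = (weights[:, None] * anchors).sum(axis=0) / weights.sum()
print(f"estimated position: {estimate}")
```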

Keywords: anchor nodes, centroid algorithm, communication graph, radio signal strength

Procedia PDF Downloads 491
24116 A Study on Valve Life Evaluation Based on Test Data

Authors: Binjuan Xu, Qian Zhao, Ping Jiang, Bo Guo, Zhijun Cheng, Xiaoyue Wu

Abstract:

Astronautical valves are key units in the engine systems of astronautical products; their reliability influences the outcome of rocket or missile launches and may even lead to damage to staff and devices on the ground. Failures in the engine system may also affect the hitting accuracy and flight range of missiles. Therefore, high reliability is essential for astronautical products. Quite a few studies estimate valve reliability from few failure test data; this paper proposes a new method to do so. According to the corresponding tests for different failure modes, the paper takes advantage of test data acquired from temperature, vibration, and action tests to estimate the reliability for each failure mode, and then regards these three kinds of tests as three stages in the product's process, integrating the results to obtain the valves' reliability. Through a comparison of results obtained from test data and from simulated data, the paper illustrates how to obtain valve reliability from few failure data with failure modes and shows that the results are effective and rational.
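
A minimal sketch of integrating per-stage reliabilities is given below: an exponential model is fitted to (possibly censored) test times for each of the temperature, vibration, and action stages, and the stage reliabilities are multiplied as independent series stages. All numbers and the exponential assumption are illustrative, not the paper's model.

```python
import numpy as np

# Hypothetical test records per stage: operating times (hours) and whether each
# unit failed (1) or was censored at test end (0).
stages = {
    "temperature": (np.array([500., 500., 420., 500., 380.]), np.array([0, 0, 1, 0, 1])),
    "vibration":   (np.array([200., 200., 150., 200., 200.]), np.array([0, 0, 1, 0, 0])),
    "action":      (np.array([1000., 860., 1000., 1000., 940.]), np.array([0, 1, 0, 0, 1])),
}

mission_time = 100.0   # hours at which reliability is evaluated
reliability = 1.0
for name, (times, failed) in stages.items():
    # Exponential MLE with right-censoring: MTTF = total time on test / number of failures.
    mttf = times.sum() / max(failed.sum(), 1)
    r_stage = np.exp(-mission_time / mttf)
    reliability *= r_stage                      # stages treated as independent series stages
    print(f"{name:12s} MTTF={mttf:7.1f} h  R({mission_time:.0f} h)={r_stage:.4f}")

print(f"integrated valve reliability: {reliability:.4f}")
```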

Keywords: censored data, temperature tests, valves, vibration tests

Procedia PDF Downloads 323
24115 Development of Energy Benchmarks Using Mandatory Energy and Emissions Reporting Data: Ontario Post-Secondary Residences

Authors: C. Xavier Mendieta, J. J McArthur

Abstract:

Governments are playing an increasingly active role in reducing carbon emissions, and a key strategy has been the introduction of mandatory energy disclosure policies. These policies have resulted in a significant amount of publicly available data, providing researchers with a unique opportunity to develop location-specific energy and carbon emission benchmarks from this data set, which can then be used to develop building archetypes and to inform urban energy models. This study presents the development of such a benchmark using the public reporting data. The data from Ontario's Ministry of Energy for Post-Secondary Educational Institutions are being used to develop a series of building-archetype dynamic building loads and energy benchmarks to fill a gap in the currently available building database. This paper presents the development of a benchmark for college and university residences within ASHRAE climate zone 6 areas in Ontario using the mandatory disclosure energy and greenhouse gas emissions data. The methodology presented includes data cleaning, statistical analysis, and benchmark development, and lessons learned from this investigation are presented and discussed to inform the development of future energy benchmarks from this larger data set. The key findings from this initial benchmarking study are: (1) the importance of careful data screening and outlier identification to develop a valid dataset; (2) the key features used to develop a model of the data are building age, size, and occupancy schedules, and these can be used to estimate energy consumption; and (3) policy changes affecting primary energy generation significantly affected greenhouse gas emissions, and consideration of these factors was critical to evaluating the validity of the reported data.
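
A minimal sketch of the screening-and-benchmarking step is shown below: reported consumption is normalized to an energy use intensity, outliers are removed with an interquartile-range rule, and a median benchmark is computed. The records and thresholds are invented for illustration.

```python
import pandas as pd

# Hypothetical mandatory-disclosure records for post-secondary residences.
df = pd.DataFrame({
    "floor_area_m2": [12000, 8000, 15000, 9000, 20000, 7000],
    "year_built":    [1968, 1995, 1972, 2008, 1985, 2012],
    "energy_kwh":    [2.9e6, 1.6e6, 4.1e6, 1.4e6, 9.9e6, 1.1e6],
})
df["eui_kwh_m2"] = df["energy_kwh"] / df["floor_area_m2"]

# Outlier screening with the interquartile-range rule before computing the benchmark.
q1, q3 = df["eui_kwh_m2"].quantile([0.25, 0.75])
iqr = q3 - q1
clean = df[df["eui_kwh_m2"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)]

benchmark = clean["eui_kwh_m2"].median()
print(f"median energy use intensity benchmark: {benchmark:.0f} kWh/m2/yr")
```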

Keywords: building archetypes, data analysis, energy benchmarks, GHG emissions

Procedia PDF Downloads 287
24114 Collision Detection Algorithm Based on Data Parallelism

Authors: Zhen Peng, Baifeng Wu

Abstract:

Modern computing technology has entered the era of parallel computing, with a trend towards sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend: it can gather more and more computing capability by increasing the number of processor cores without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications are facing the challenge of increasingly large amounts of data. Data-parallel computing will be an important way to further improve the performance of these applications. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unparalleled with respect to traditional algorithms.
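
A minimal sketch of the data-parallel idea is shown below: after complex objects are decomposed into simple axis-aligned boxes, all pairwise overlap tests are evaluated at once with vectorized (SIMD-style) array operations instead of nested loops. The geometry is hypothetical.

```python
import numpy as np

def aabb_collisions(boxes_a, boxes_b):
    """Data-parallel axis-aligned bounding-box overlap test.

    boxes_* have shape (n, 6): [xmin, ymin, zmin, xmax, ymax, zmax].
    Returns index pairs (i, j) of overlapping boxes; all pairs are checked
    at once via NumPy broadcasting rather than nested Python loops.
    """
    a_min, a_max = boxes_a[:, None, :3], boxes_a[:, None, 3:]
    b_min, b_max = boxes_b[None, :, :3], boxes_b[None, :, 3:]
    overlap = np.all((a_min <= b_max) & (b_min <= a_max), axis=2)
    return np.argwhere(overlap)

# Hypothetical simple objects decomposed from two complex BIM elements.
wall = np.array([[0, 0, 0, 4, 0.3, 3], [4, 0, 0, 8, 0.3, 3]], dtype=float)
duct = np.array([[3.5, -0.2, 1.0, 4.5, 0.5, 1.4]], dtype=float)
print(aabb_collisions(wall, duct))   # (wall segment, duct segment) pairs that collide
```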

Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability

Procedia PDF Downloads 271
24113 Changing Arbitrary Data Transmission Period by Using Bluetooth Module on Gas Sensor Node of Arduino Board

Authors: Hiesik Kim, Yong-Beom Kim, Jaheon Gu

Abstract:

Internet of Things (IoT) applications are widely deployed and spreading worldwide, so local wireless data transmission techniques must be developed to keep up. Bluetooth is a wireless data communication technique defined by the Bluetooth Special Interest Group (SIG), using the 2.4 GHz frequency range and exploiting frequency hopping to avoid collisions with other devices. For the experiment, equipment transmitting measured data was built using an Arduino open-source hardware board, a gas sensor, and a Bluetooth module, and an algorithm controlling the transmission rate was demonstrated. The experiment on controlling the transmission rate was carried out by developing an Android application that receives the measured data, and the experimental results show that controlling this rate is feasible. In the future, improvement of the communication algorithm will be needed, because a few errors occur when data is transferred or received.

Keywords: Arduino, Bluetooth, gas sensor, IoT, transmission

Procedia PDF Downloads 264
24112 Real-Time Sensor Fusion for Mobile Robot Localization in an Oil and Gas Refinery

Authors: Adewole A. Ayoade, Marshall R. Sweatt, John P. H. Steele, Qi Han, Khaled Al-Wahedi, Hamad Karki, William A. Yearsley

Abstract:

Understanding the behavioral characteristics of sensors is a crucial step in fusing data from several sensors of different types. This paper introduces a practical, real-time approach to integrate heterogeneous sensor data to achieve higher accuracy than would be possible from any one individual sensor in localizing a mobile robot. We use this approach in both indoor and outdoor environments, and it is especially appropriate for environments such as oil and gas refineries due to their sparse and featureless nature. We have studied the individual contribution of each sensor's data to the overall combined accuracy achieved from the fusion process. A Sequential Update Extended Kalman Filter (EKF) using validation gates was used to integrate GPS data, compass data, WiFi data, Inertial Measurement Unit (IMU) data, vehicle velocity, and pose estimates from a fiducial marker system. Results show that the approach can enable a mobile robot to navigate autonomously in any environment using a priori information.
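
A minimal sketch of a sequential measurement update with a validation gate is shown below, simplified to a linear 2-D position state; the sensor values, noise covariances, and gate threshold are assumptions, and a full EKF would additionally linearize each measurement model.

```python
import numpy as np

def sequential_update(x, P, measurements, gate=9.21):
    """Fuse several sensor readings one after another (sequential update),
    rejecting any measurement whose normalized innovation exceeds the gate
    (9.21 is the chi-square 99% threshold for 2 degrees of freedom).

    Each measurement is (z, H, R): value, observation matrix, noise covariance.
    A linear update is shown; an EKF would use the Jacobian of h(x) as H.
    """
    for z, H, R in measurements:
        y = z - H @ x                         # innovation
        S = H @ P @ H.T + R                   # innovation covariance
        d2 = float(y.T @ np.linalg.inv(S) @ y)
        if d2 > gate:                         # validation gate: skip outliers
            continue
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ y
        P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Hypothetical 2-D position state fused from a GPS fix and a fiducial-marker fix.
x = np.array([10.0, 5.0]); P = np.eye(2) * 4.0
H = np.eye(2)
meas = [(np.array([10.8, 5.3]), H, np.eye(2) * 2.0),    # GPS
        (np.array([10.2, 4.9]), H, np.eye(2) * 0.5)]    # fiducial marker
x, P = sequential_update(x, P, meas)
print(x)
```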

Keywords: inspection mobile robot, navigation, sensor fusion, sequential update extended Kalman filter

Procedia PDF Downloads 452
24111 Energy Efficient Massive Data Dissemination Through Vehicle Mobility in Smart Cities

Authors: Salman Naseer

Abstract:

One of the main challenges of operating a smart city (SC) is collecting the massive data generated from multiple data sources (DS) and transmitting them to the control units (CU) for further processing and analysis. These ever-increasing data demands require not only more and more capacity in the transmission channels but also result in resource over-provisioning to meet resilience requirements, and thus unavoidable waste because of data fluctuations throughout the day. In addition, the high energy consumption (EC) and carbon discharges from these data transmissions pose serious issues for the environment we live in. Therefore, to overcome the issues of intensive EC and carbon emissions (CE) from massive data dissemination in smart cities, we propose an energy-efficient and carbon-reducing approach that utilizes the daily mobility of existing vehicles as an alternative communication channel for data dissemination in smart cities. To illustrate the effectiveness and efficiency of our approach, we take Auckland City in New Zealand as an example, assuming massive data generated by various sources geographically scattered throughout the Auckland region and destined for control centres located in the city centre. The numerical results show that our proposed approach can provide up to 5 times lower delay when transferring large volumes of data by utilizing existing daily vehicle mobility than the conventional transmission network. Moreover, our proposed approach offers about 30% less EC and CE than the conventional network transmission approach.

Keywords: smart city, delay tolerant network, infrastructure offloading, opportunistic network, vehicular mobility, energy consumption, carbon emission

Procedia PDF Downloads 124