Search results for: data source
25718 AI-Enabled Smart Contracts for Reliable Traceability in the Industry 4.0
Authors: Harris Niavis, Dimitra Politaki
Abstract:
The manufacturing industry has been collecting vast amounts of data for monitoring product quality thanks to advances in the ICT sector, and dedicated IoT infrastructure is deployed to track and trace the production line. However, industries have not yet managed to unleash the full potential of these data due to defective data collection methods and untrusted data storage and sharing. Blockchain is gaining increasing ground as a key technology enabler for Industry 4.0 and the smart manufacturing domain, as it enables the secure storage and exchange of data between stakeholders. On the other hand, AI techniques are increasingly used to detect anomalies in batch and time-series data, enabling the identification of unusual behaviors. The proposed scheme is based on smart contracts to enable automation and transparency in the data exchange, coupled with anomaly detection algorithms to enable reliable data ingestion in the system. Before sensor measurements are fed to the blockchain component and the smart contracts, the anomaly detection mechanism uniquely combines artificial intelligence models to effectively detect unusual values such as outliers and extreme deviations in the incoming data. Specifically, autoregressive integrated moving average (ARIMA), long short-term memory (LSTM) and dense-based autoencoders, as well as generative adversarial network (GAN) models, are used to detect both point and collective anomalies. Towards the goal of preserving the privacy of industries' information, the smart contracts employ techniques to ensure that only anonymized pointers to the actual data are stored on the ledger while sensitive information remains off-chain. In the same spirit, blockchain technology guarantees the security of the data storage through strong cryptography as well as the integrity of the data through the decentralization of the network and the execution of the smart contracts by the majority of the blockchain network actors. The blockchain component of the Data Traceability Software is based on the Hyperledger Fabric framework, which lays the ground for the deployment of smart contracts and APIs to expose the functionality to the end-users. The results of this work demonstrate that such a system can increase the quality of the end-products and the trustworthiness of the monitoring process in the smart manufacturing domain. The proposed AI-enabled data traceability software can be employed by industries to accurately trace and verify records about quality through the entire production chain and take advantage of the multitude of monitoring records in their databases.
Keywords: blockchain, data quality, Industry 4.0, product quality
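To illustrate the ingestion gate the abstract describes, the sketch below trains a dense autoencoder on presumably normal sensor windows and withholds from the ledger any window whose reconstruction error is unusually high. It is a minimal sketch, not the authors' implementation: the window length, layer widths, Keras toolchain, and 99th-percentile threshold are all assumptions.

```python
# Illustrative sketch (not the paper's code): a dense autoencoder
# trained on windows assumed anomaly-free; readings whose
# reconstruction error exceeds a threshold are held back from the
# ledger. Window size, widths, and threshold are assumptions.
import numpy as np
from tensorflow import keras

WINDOW = 32  # hypothetical length of a sensor window

def build_autoencoder(window: int) -> keras.Model:
    model = keras.Sequential([
        keras.layers.Input(shape=(window,)),
        keras.layers.Dense(16, activation="relu"),   # encoder
        keras.layers.Dense(8, activation="relu"),    # bottleneck
        keras.layers.Dense(16, activation="relu"),   # decoder
        keras.layers.Dense(window, activation="linear"),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Train on windows assumed to be anomaly-free.
normal = np.random.normal(20.0, 0.5, size=(1000, WINDOW)).astype("float32")
ae = build_autoencoder(WINDOW)
ae.fit(normal, normal, epochs=20, batch_size=64, verbose=0)

# Calibrate: flag windows whose reconstruction error is unusually large.
errors = np.mean((ae.predict(normal, verbose=0) - normal) ** 2, axis=1)
threshold = np.percentile(errors, 99)

def admit_to_ledger(window: np.ndarray) -> bool:
    """Return True if the window looks normal enough to ingest."""
    err = float(np.mean((ae.predict(window[None, :], verbose=0) - window) ** 2))
    return err <= threshold
```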
Procedia PDF Downloads 189
25717 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction
Authors: Qais M. Yousef, Yasmeen A. Alshaer
Abstract:
Over the last few years, the amount of data available around the globe has increased rapidly. This came with the emergence of concepts such as big data and the Internet of Things, which have furnished a suitable solution for the availability of data all over the world. However, managing this massive amount of data remains a challenge due to its large variety of types and wide distribution. Consequently, locating the required file, particularly on the first attempt, is not an easy task, due to the strong similarity of names among different files distributed on the web, and the accuracy and speed of search have been negatively affected. This work presents a method that uses electroencephalography signals to locate files based on their contents. Building on the concept of natural brainwave processing, this work analyzes the brainwave signals of different people, extracting their most appropriate features using a multi-objective metaheuristic algorithm and then classifying them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and retrieve them as a first choice for the user.
Keywords: artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization
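A minimal sketch of this kind of pipeline, under stated assumptions: band-power features are extracted from EEG epochs and fed to a small neural-network classifier. The sampling rate, band limits, and classifier size are illustrative, and the paper's multi-objective feature-selection step is replaced here by fixed bands.

```python
# Hedged sketch: EEG band-power features feeding a small classifier.
# FS, BANDS, and the network size are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier

FS = 128  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch: np.ndarray) -> np.ndarray:
    """Average spectral power in each frequency band for one epoch."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in BANDS.values()])

# Toy data: 200 one-second epochs with binary labels (target file vs. not).
rng = np.random.default_rng(0)
epochs = rng.normal(size=(200, FS))
labels = rng.integers(0, 2, size=200)

X = np.array([band_powers(e) for e in epochs])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, labels)
print(clf.predict(X[:5]))
```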
Procedia PDF Downloads 175
25716 IoT Based Approach to Healthcare System for a Quadriplegic Patient Using EEG
Authors: R. Gautam, P. Sastha Kanagasabai, G. N. Rathna
Abstract:
The proposed healthcare system enables quadriplegic patients and people with severe motor disabilities to send commands to electronic devices and monitor their vitals. The growth of the Brain-Computer Interface (BCI) has led to rapid development in 'assistive systems' for the disabled, called 'assistive domotics'. A Brain-Computer Interface is capable of reading the brainwaves of an individual and analysing them to obtain meaningful data. This processed data can be used to assist people with speech disorders, and sometimes people with limited locomotion, to communicate. In this project, the Emotiv EPOC headset is used to obtain the electroencephalogram (EEG). The obtained data is processed to communicate pre-defined commands over the internet to the desired mobile phone user. Other vital information, such as heart rate, blood pressure, ECG, and body temperature, is monitored and uploaded to the server. Data analytics enables physicians to scan databases for a specific illness. The data is processed on the Intel Edison system-on-chip (SoC). Patient metrics are displayed via the Intel IoT Analytics cloud service.
Keywords: brain computer interface, Intel Edison, Emotiv EPOC, IoT analytics, electroencephalogram
Procedia PDF Downloads 186
25715 Searchable Encryption in Cloud Storage
Authors: Ren Junn Hwang, Chung-Chien Lu, Jain-Shing Wu
Abstract:
Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, exactly retrieving the target file from among the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multi-keyword searches over encrypted cloud data by applying k-nearest neighbor technology. The protocol ranks the relevance scores of encrypted files and keywords, and prevents cloud servers from learning the search keywords submitted by a cloud user. To reduce file-transfer communication costs, the cloud server returns encrypted files in order of relevance. Moreover, when a cloud user inputs a misspelled keyword, the user can still retrieve the target files from the cloud server, provided the number of wrong letters does not exceed a given threshold. In addition, the proposed scheme satisfies the security requirements for outsourced data storage.
Keywords: fault-tolerant search, multi-keyword search, outsource storage, ranked search, searchable encryption
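The core algebra behind secure k-NN ranking of this kind can be sketched as follows. This is the classic matrix-transform ('ASPE'-style) idea rather than the authors' full protocol, which would add vector splitting, dummy dimensions, and the fault-tolerance layer; the dimensions and weights below are invented for illustration.

```python
# Sketch of the matrix-transform idea behind secure k-NN ranking:
# index vectors are encrypted with M^T and queries with M^{-1}, so
# the server can rank by inner product without seeing plaintext
# keyword weights. Core algebra only; real schemes add more layers.
import numpy as np

rng = np.random.default_rng(1)
DIM = 8  # assumed keyword-dictionary size

M = rng.normal(size=(DIM, DIM))          # secret invertible key matrix
M_inv = np.linalg.inv(M)

def encrypt_index(p: np.ndarray) -> np.ndarray:
    return M.T @ p                        # stored per encrypted file

def encrypt_query(q: np.ndarray) -> np.ndarray:
    return M_inv @ q                      # trapdoor sent to the server

files = rng.random(size=(5, DIM))        # per-file keyword weight vectors
query = rng.random(size=DIM)

enc_files = [encrypt_index(p) for p in files]
trapdoor = encrypt_query(query)

# Server-side ranking: (M^T p) . (M^{-1} q) = p . q
scores = [float(ep @ trapdoor) for ep in enc_files]
ranked = np.argsort(scores)[::-1]        # return files in relevance order
assert np.allclose(scores, files @ query)
print("relevance order:", ranked)
```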
Procedia PDF Downloads 383
25714 A Bivariate Inverse Generalized Exponential Distribution and Its Applications in Dependent Competing Risks Model
Authors: Fatemah A. Alqallaf, Debasis Kundu
Abstract:
The aim of this paper is to introduce a bivariate inverse generalized exponential distribution which has a singular component. The proposed bivariate distribution can be used when the marginals have heavy-tailed distributions and non-monotone hazard functions. Due to the presence of the singular component, it can be used quite effectively when there are ties in the data. Since it has four parameters, it is a very flexible bivariate distribution, and it can be used quite effectively for analyzing various bivariate data sets. Several dependency properties and dependency measures have been obtained. The maximum likelihood estimators cannot be obtained in closed form, and obtaining them involves solving a four-dimensional optimization problem. To avoid that, we have proposed to use an EM algorithm, which involves solving only one non-linear equation at each 'E'-step. Hence, the implementation of the proposed EM algorithm is very straightforward in practice. Extensive simulation experiments have been performed, and one data set has been analyzed to show the effectiveness of the proposed model. We have observed that the proposed bivariate inverse generalized exponential distribution can be used for modeling dependent competing risks data.
Keywords: Block and Basu bivariate distributions, competing risks, EM algorithm, Marshall-Olkin bivariate exponential distribution, maximum likelihood estimators
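For orientation, a plausible formulation consistent with the abstract and its keywords is sketched below; the paper's exact parameterization may differ. It starts from the standard generalized inverted exponential law and uses a Marshall-Olkin-type shared-shock construction, which is what produces the singular component on the diagonal.

```latex
% Univariate inverse (inverted) generalized exponential law, x > 0:
\[
F(x;\alpha,\lambda) = 1 - \bigl(1 - e^{-\lambda/x}\bigr)^{\alpha},
\qquad
S(x;\alpha,\lambda) = \bigl(1 - e^{-\lambda/x}\bigr)^{\alpha}.
\]
% A Marshall--Olkin-type construction consistent with the keywords:
% take independent U_i with survival S(x; \alpha_i, \lambda), i = 0,1,2,
\[
X_1 = \min(U_0, U_1), \qquad X_2 = \min(U_0, U_2),
\]
% so the joint survival function factors as
\[
S(x_1, x_2) = \bigl(1 - e^{-\lambda/x_1}\bigr)^{\alpha_1}
              \bigl(1 - e^{-\lambda/x_2}\bigr)^{\alpha_2}
              \bigl(1 - e^{-\lambda/\max(x_1, x_2)}\bigr)^{\alpha_0}.
\]
% The shared shock U_0 puts positive mass on the diagonal x_1 = x_2,
% which is the singular component that makes ties tractable; the four
% parameters are (\alpha_0, \alpha_1, \alpha_2, \lambda).
```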
Procedia PDF Downloads 143
25713 Using Large Databases and Interviews to Explore the Temporal Phases of Technology-Based Entrepreneurial Ecosystems
Authors: Elsie L. Echeverri-Carroll
Abstract:
Entrepreneurial ecosystems have become an important concept to explain the birth and sustainability of technology-based entrepreneurship within regions. However, as a theoretical concept, the temporal evolution of entrepreneurial ecosystems remains underdeveloped, making it difficult to understand their dynamic contributions to entrepreneurs. This paper argues that successful technology-based ecosystems go through three cumulative spawning stages: corporate spawning, entrepreneurial spawning, and community spawning. The importance of corporate incubation in vibrant entrepreneurial ecosystems is well documented in the entrepreneurial literature. Similarly, entrepreneurial spawning processes for venture capital-backed startups are well documented in the financial literature. In contrast, there is little understanding of both the third stage of entrepreneurial spawning (when a community of entrepreneurs becomes a source of firm spawning) and the temporal sequence in which spawning effects occur in a region. We test this three-stage model of entrepreneurial spawning using data from two large databases on firm births, the Secretary of State (160,000 observations) and the National Establishment Time Series (NEST, with 150,000 observations), and information collected from 60 1½-hour interviews with startup founders and representatives of key entrepreneurial organizations. This temporal model is illustrated with a case study of Austin, Texas, ranked by the Kauffman Foundation as the number one entrepreneurial city in the United States in 2015 and 2016. The 1½-year study funded by the Kauffman Foundation demonstrates the importance of taking into consideration the temporal contributions of both large and entrepreneurial firms in understanding the factors that contribute to the birth and growth of technology-based entrepreneurial regions. More importantly, these findings could offer an important road map for regions that seek to advance their entrepreneurial ecosystems.
Keywords: entrepreneurial ecosystems, entrepreneurial industrial clusters, high-technology, temporal changes
Procedia PDF Downloads 272
25712 Evaluation of NASA POWER and CRU Precipitation and Temperature Datasets over a Desert-prone Yobe River Basin: An Investigation of the Impact of Drought in the North-East Arid Zone of Nigeria
Authors: Yusuf Dawa Sidi, Abdulrahman Bulama Bizi
Abstract:
The most dependable and precise source of climate data is often gauge observation. However, long-term records of gauge observations are unavailable in many regions around the world. In recent years, a number of gridded climate datasets with high spatial and temporal resolutions have emerged as viable alternatives to gauge-based measurements. However, it is crucial to thoroughly evaluate their performance prior to utilising them in hydroclimatic applications. Therefore, this study aims to assess the effectiveness of the NASA Prediction of Worldwide Energy Resources (NASA POWER) and Climate Research Unit (CRU) datasets in accurately estimating precipitation and temperature patterns within the dry region of Nigeria from 1990 to 2020. The study employs widely used statistical metrics and the Standardised Precipitation Index (SPI) to effectively capture the monthly variability of precipitation and temperature and the inter-annual anomalies in rainfall. The findings suggest that CRU exhibited superior performance compared to NASA POWER in terms of monthly precipitation and minimum and maximum temperatures, demonstrating a high correlation and much lower error values for both RMSE and MAE. Nevertheless, NASA POWER exhibited moderate agreement with gauge observations in replicating monthly precipitation. The analysis of the SPI reveals that the CRU product exhibits superior performance compared to NASA POWER in accurately reflecting inter-annual variations in rainfall anomalies. The findings of this study indicate that the CRU gridded product is the most favourable gridded precipitation product of the two.
Keywords: CRU, climate change, precipitation, SPI, temperature
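For readers unfamiliar with the SPI, the sketch below shows one standard way to compute it; it is a generic illustration, not the study's code. A gamma distribution is fitted to aggregated precipitation and the fitted probabilities are passed through the standard normal quantile function; the 3-month window and the zero-precipitation handling are common conventions assumed here.

```python
# Generic SPI computation: gamma fit on accumulated precipitation,
# then transform through the standard normal quantile function.
import numpy as np
from scipy import stats

def spi(precip: np.ndarray, window: int = 3) -> np.ndarray:
    """Standardised Precipitation Index for a monthly series."""
    # Rolling accumulation over `window` months.
    acc = np.convolve(precip, np.ones(window), mode="valid")

    # Mixed distribution: probability of zero plus gamma on positives.
    zeros = acc == 0
    q = zeros.mean()
    shape, loc, scale = stats.gamma.fit(acc[~zeros], floc=0)
    cdf = q + (1 - q) * stats.gamma.cdf(acc, shape, loc=loc, scale=scale)

    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

rng = np.random.default_rng(2)
monthly_rain = rng.gamma(2.0, 20.0, size=12 * 31)  # toy 1990-2020 series
print(spi(monthly_rain)[:5])
```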
Procedia PDF Downloads 89
25711 Blind Data Hiding Technique Using Interpolation of Subsampled Images
Authors: Singara Singh Kasana, Pankaj Garg
Abstract:
In this paper, a blind data hiding technique based on interpolation of subsampled versions of a cover image is proposed. A subsampled image is taken as a reference image, and an interpolated image is generated from this reference image. The difference between the original cover image and the interpolated image is then used to embed secret data. Comparisons with existing interpolation-based techniques show that the proposed technique provides higher embedding capacity and better visual quality of marked images. Moreover, the performance of the proposed technique is more stable across different images.
Keywords: interpolation, image subsampling, PSNR, SIM
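A simplified sketch of the general embedding idea this family of techniques shares (not the authors' exact scheme): reference pixels are kept, the pixel between two references is interpolated as their mean, and the gap between the original and the interpolated value bounds how many secret bits that position can carry.

```python
# Simplified interpolation-based embedding on one pixel row; the
# capacity rule (floor(log2(gap)) bits per position) follows the
# common pattern in this literature, used here as an assumption.
import numpy as np

def embed(row: np.ndarray, bits: str) -> np.ndarray:
    marked = row.astype(int).copy()
    pos = 0
    for i in range(1, len(row) - 1, 2):           # interpolated positions
        interp = (row[i - 1] + row[i + 1]) // 2   # reference-pixel mean
        d = abs(int(row[i]) - int(interp))
        n = int(np.floor(np.log2(d))) if d > 1 else 0
        if n == 0 or pos >= len(bits):
            continue
        chunk = bits[pos:pos + n].ljust(n, "0")
        marked[i] = interp + int(chunk, 2)        # payload rides the gap
        pos += n
    return marked.astype(np.uint8)

cover_row = np.array([50, 60, 70, 90, 80, 75, 64], dtype=np.uint8)
print(embed(cover_row, "1011001"))
```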
Procedia PDF Downloads 578
25710 On the Possibility of Real Time Characterisation of Ambient Toxicity Using Multi-Wavelength Photoacoustic Instrument
Authors: Tibor Ajtai, Máté Pintér, Noémi Utry, Gergely Kiss-Albert, Andrea Palágyi, László Manczinger, Csaba Vágvölgyi, Gábor Szabó, Zoltán Bozóki
Abstract:
According to the best knowledge of the authors, here we experimentally demonstrate, for the first time, a quantified correlation between the real-time measured optical features of the ambient aerosol and off-line measured toxicity data. Using these correlations, we then present a novel methodology for the real-time characterisation of ambient toxicity based on multi-wavelength aerosol-phase photoacoustic measurement. Ambient carbonaceous particulate matter is one of the most intensively studied atmospheric constituents in climate science nowadays. Beyond its climatic impact, atmospheric soot also plays an important role as an air pollutant that harms human health. Moreover, according to the latest scientific assessments, ambient soot is the second most important anthropogenic emission source, while in the health aspect it is one of the most harmful atmospheric constituents as well. Despite its importance, a generally accepted standard methodology for the quantitative determination of ambient toxicity is not yet available. Ambient toxicity measurement is predominantly based on the posterior analysis of filter-accumulated aerosol with limited time resolution. Most toxicological studies are based on operational definitions using different measurement protocols; therefore, comprehensive analysis of the existing data sets is really limited in many cases. The situation is further complicated by the fact that, even during its relatively short residence time, the physicochemical features of the aerosol can be masked significantly by the actual ambient factors. Therefore, improving the time resolution of the existing methodology and developing real-time methodology for air quality monitoring are pressing issues in air pollution research. During the last decades, many experimental studies have verified that there is a relation between the chemical composition and the absorption features, quantified by the Absorption Angström Exponent (AAE), of carbonaceous particulate matter. Although the scientific community agrees that photoacoustic spectroscopy (PAS) is so far the only methodology that can measure light absorption by aerosol in an accurate and reliable way, multi-wavelength PAS instruments, which are able to selectively characterise the wavelength dependency of absorption, have become available only in the last decade. In this study, the first results of an intensive measurement campaign focusing on the physicochemical and toxicological characterisation of ambient particulate matter are presented. Here we demonstrate the complete microphysical characterisation of wintertime urban ambient aerosol, including optical absorption and scattering as well as size distribution, using our recently developed state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS), an integrating nephelometer (Aurora 3000), as well as a single mobility particle sizer and optical particle counter (SMPS+C). Beyond this on-line characterisation of the ambient aerosol, we also demonstrate the results of eco-, cyto- and genotoxicity measurements of ambient aerosol based on the posterior analysis of filter-accumulated aerosol with 6 h time resolution. We demonstrate a diurnal variation of toxicities and of AAE data deduced directly from the multi-wavelength absorption measurement results.
Keywords: photoacoustic spectroscopy, absorption Angström exponent, toxicity, Ames test
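The AAE referred to above has a standard two-wavelength definition, reproduced here for reference; with a four-wavelength instrument such as the 4λ-PAS, AAE is typically obtained instead as the slope of a log-log fit of absorption against wavelength.

```latex
% Standard definition of the Absorption Angstrom Exponent:
\[
\mathrm{AAE}
  = -\,\frac{\ln\!\bigl(b_{\mathrm{abs}}(\lambda_1)/b_{\mathrm{abs}}(\lambda_2)\bigr)}
            {\ln\!\bigl(\lambda_1/\lambda_2\bigr)},
\]
% where b_abs(lambda) is the absorption coefficient measured by the
% photoacoustic instrument at wavelength lambda, and lambda_1 < lambda_2.
```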
Procedia PDF Downloads 302
25709 Use of Electrochemical Methods for the Inhibition of Scaling with Green Products
Authors: Samira Ghizellaoui, Manel Boumagoura
Abstract:
The municipality of Constantine in eastern Algeria draws water from the Hamma groundwater source. This water's high fouling capacity is due to its high content of bicarbonate (442 mg/L) and calcium (136 mg/L). This work focuses on the use of three new green inhibitors for reducing calcium carbonate scale formation, gallic acid, quercetin, and alginate, and on a comparison between them. These inhibitors have proven to be green antiscalants because they have no impact on the environment. Electrochemical methods (chronoamperometry and impedancemetry) were used to evaluate their performance. According to the study, these inhibitors are excellent green chemical inhibitors of scaling, and the best is quercetin, because it gave a good result at a lower concentration (2 mg/L) compared to the other inhibitors.
Keywords: scaling, green inhibitor, chronoamperometry, impedancemetry
Procedia PDF Downloads 116
25708 Active Contours for Image Segmentation Based on Complex Domain Approach
Authors: Sajid Hussain
Abstract:
A complex domain approach for image segmentation based on active contours has been designed, in which the contour deforms step by step to partition an image into numerous expedient regions. A novel region-based trigonometric complex pressure force function is proposed, which propagates around the region of interest using image forces. The signed trigonometric force function controls the propagation of the active contour, and the active contour stops accurately on the exact edges of the object. The proposed model makes the level set function binary and uses a Gaussian smoothing kernel to regularize it and avoid the re-initialization procedure. The working principle of the proposed model is as follows: the real image data is transformed into complex data by taking iota (i) times the image data, and the average of iota (i) times the horizontal and vertical components of the gradient of the image data is inserted into the proposed model to capture the complex gradient of the image data. A simple finite-difference mathematical technique has been used to implement the proposed model. The efficiency and robustness of the proposed model have been verified and compared with other state-of-the-art models.
Keywords: image segmentation, active contour, level set, Mumford and Shah model
Procedia PDF Downloads 114
25707 Discerning Divergent Nodes in Social Networks
Authors: Mehran Asadi, Afrand Agah
Abstract:
In data mining, partitioning is used as a fundamental tool for classification. With the help of partitioning, we study the structure of data, which allows us to envision decision rules that can be applied to classification trees. In this research, we used an online social network dataset and all of its attributes (e.g., node features, labels, etc.) to determine what constitutes an above-average chance of being a divergent node. We used the R statistical computing language to conduct the analyses in this report. The data were found in the UC Irvine Machine Learning Repository. This research introduces the basic concepts of classification in online social networks. In this work, we address overfitting and describe different approaches for evaluation and performance comparison of different classification methods. In classification, the main objective is to categorize different items and assign them to different groups based on their properties and similarities. In data mining, recursive partitioning is utilized to probe the structure of a data set, which allows us to envision decision rules and apply them to classify data into several groups. Estimating densities is hard, especially in high dimensions with limited data. Of course, we do not know the densities, but we can estimate them using classical techniques. First, we calculated the correlation matrix of the dataset to see if any predictors are highly correlated with one another. By calculating the correlation coefficients for the predictor variables, we see that density is strongly correlated with transitivity. We initialized a data frame to easily compare the quality of the resulting classification methods, and utilized decision trees (with k-fold cross-validation to prune the tree). A decision tree is a non-parametric classification method, which uses a set of rules to predict that each observation belongs to the most commonly occurring class label of the training data. Our method aggregates many decision trees to create an optimized model that is not susceptible to overfitting. When using a decision tree, however, it is important to use cross-validation to prune the tree in order to narrow it down to the most important variables.
Keywords: online social networks, data mining, social cloud computing, interaction and collaboration
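The abstract's tree-plus-cross-validation workflow can be sketched as follows. The analyses were done in R; this is an equivalent scikit-learn formulation using cost-complexity pruning, with placeholder node-feature data standing in for the real network attributes.

```python
# Hedged sketch: decision tree with k-fold CV used to choose the
# pruning strength (ccp_alpha). Features and labels are synthetic
# placeholders, not the UCI social-network data.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
X = rng.random(size=(500, 4))            # e.g. density, transitivity, ...
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 500) > 0.8).astype(int)

# Candidate pruning levels from the cost-complexity path.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)
alphas = np.unique(path.ccp_alphas)

# 10-fold CV picks the alpha that prunes the tree best.
search = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={"ccp_alpha": alphas},
    cv=10,
)
search.fit(X, y)
print("best alpha:", search.best_params_["ccp_alpha"],
      "CV accuracy:", round(search.best_score_, 3))
```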
Procedia PDF Downloads 157
25706 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network
Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
We consider a database that records average traffic speeds measured at five-minute intervals for all the links in the traffic network of a metropolitan city. Models learned from these data that can predict future traffic speeds would be beneficial for applications such as car navigation systems, but building predictive models for every link becomes a nontrivial job if the number of links in a given network is huge. An advantage of adopting the k-nearest neighbor (k-NN) method as the predictive model is that it does not require any explicit model building. Instead, k-NN takes a long time to make a prediction because it needs to search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale behind this is that it suffices to look only at the recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different sets of features, namely the current and past traffic speeds of the target link and of the neighbor links up- and downstream of it. The performances of these models are compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
Keywords: big data, k-NN, machine learning, traffic speed prediction
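An illustrative sketch of the predictor and of the paper's speed-up idea, restricting the neighbor search to only the most recent rows of the history instead of the full database; the feature layout, k, and window size are assumptions.

```python
# k-NN speed prediction with an optional "recent rows only" search
# window, trading a little accuracy for much faster prediction.
import numpy as np

def knn_predict(features: np.ndarray, targets: np.ndarray,
                query: np.ndarray, k: int = 5,
                recent: int | None = None) -> float:
    """Average the targets of the k nearest feature rows."""
    if recent is not None:                 # search only the newest rows
        features, targets = features[-recent:], targets[-recent:]
    dists = np.linalg.norm(features - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return float(targets[nearest].mean())

rng = np.random.default_rng(4)
# Each row: current + 3 past speeds of the target link and 2 neighbors.
history = rng.uniform(10, 80, size=(100_000, 12))
future_speed = history[:, 0] + rng.normal(0, 2, 100_000)

query = history[-1]
print("full search :", knn_predict(history, future_speed, query))
print("recent only :", knn_predict(history, future_speed, query, recent=5_000))
```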
Procedia PDF Downloads 363
25705 Comparative Analysis of Classification Methods in Determining Non-Active Student Characteristics in Indonesia Open University
Authors: Dewi Juliah Ratnaningsih, Imas Sukaesih Sitanggang
Abstract:
Classification is one of the data mining techniques that aims to discover, from training data, a model that assigns records to the appropriate category or class. Data mining classification methods can be applied in education, for example, to determine the classification of non-active students at Indonesia Open University. This paper presents a comparison of three classification methods: Naïve Bayes, Bagging, and C4.5. The criteria used to evaluate the performance of the three classification methods are stratified cross-validation, the confusion matrix, the value of the area under the ROC curve (AUC), recall, precision, and F-measure. The data used for this paper are from the non-active Indonesia Open University students in the registration periods 2004.1 to 2012.2. The target analysis requires that non-active students be divided into 3 groups: C1, C2, and C3. The data analyzed cover 4,173 students. The results of the study show: (1) the Bagging method gave a higher degree of classification accuracy than Naïve Bayes and C4.5; (2) the Bagging classification accuracy rate is 82.99%, while those of Naïve Bayes and C4.5 are 80.04% and 82.74%, respectively; (3) the tree resulting from the Bagging classification method has a large number of nodes, so it is quite difficult to use in decision making; (4) the classification of non-active Indonesia Open University student characteristics therefore uses the C4.5 algorithm; (5) based on the C4.5 algorithm, there are 5 interesting rules which describe the characteristics of non-active Indonesia Open University students.
Keywords: comparative analysis, data mining, classification, Bagging, Naïve Bayes, C4.5, non-active students, Indonesia Open University
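The comparison protocol can be sketched as follows. Note the substitution: scikit-learn has no C4.5 implementation, so a CART decision tree stands in for it, and the student records are replaced by a synthetic placeholder of the same size.

```python
# Hedged sketch of the evaluation protocol: stratified k-fold CV over
# three classifiers; CART substitutes for C4.5, data are synthetic.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
X = rng.random(size=(4173, 6))                     # 4,173 student records
y = rng.integers(0, 3, size=4173)                  # classes C1, C2, C3

models = {
    "Naive Bayes": GaussianNB(),
    "Bagging": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                                 random_state=0),
    "CART (C4.5 stand-in)": DecisionTreeClassifier(random_state=0),
}

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    print(f"{name:22s} accuracy = {acc.mean():.4f}")
```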
Procedia PDF Downloads 315
25704 Comparative Rumen Degradable and Rumen Undegradable Fractions in Untreated, Formaldehyde and Heat Treated Vegetable Protein Sources of Pakistan
Authors: Illahi Bakhsh Marghazani, Nasrullah, Masood Ul Haq Kakar, Abdul Hameed Baloch, Ahmad Nawaz Khoso, Behram Chacher
Abstract:
Protein sources are a major part of the ration fed to dairy buffaloes in Pakistan; however, limited availability and the lack of judicious use of protein resources further constrain efforts to enhance milk and meat production. In order to gain maximum production from the limited availability of protein sources, it is necessary to balance feed for rumen degradable and rumen undegradable protein fractions. This study was planned to determine the rumen degradable and rumen undegradable fractions in all vegetable protein sources, with and without treatments (formaldehyde and heat). Samples of soybean meal, corn gluten meal 60%, maize gluten feed, guar meal, sunflower meal, rapeseed meal, rapeseed cake, canola meal, cottonseed cake, cottonseed meal, coconut cake, coconut meal, palm kernel cake, almond cake and sesame cake were collected from ten different geographical locations of Pakistan. These samples were also subjected to formaldehyde (1%/100 g CP of test feed) and heat treatments (1 hr at 15 lb psi/100 g CP of test feed). The in situ technique was used to determine the ruminal degradability characteristics. The data obtained were fitted to the Ørskov equation. Results showed that both treatments significantly (P < 0.05) decreased ruminal degradability in all vegetable protein sources compared to untreated sources; however, of the two treatments, heat treatment was more effective than formaldehyde treatment in decreasing ruminal degradability in most of the studied vegetable protein sources.
Keywords: formaldehyde and heat treatments, in situ technique, rumen degradable and rumen undegradable fractions, vegetable protein sources
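The Ørskov (Ørskov and McDonald, 1979) model to which in situ disappearance data are conventionally fitted is reproduced below; the outflow-rate extension for effective degradability is the standard companion formula, included here for context.

```latex
% Exponential degradation model fitted to in situ disappearance data:
\[
p(t) = a + b\,\bigl(1 - e^{-c\,t}\bigr),
\]
% where p(t) is degradation after t hours of incubation, a the rapidly
% soluble fraction, b the insoluble but degradable fraction, and c its
% fractional degradation rate. Effective rumen degradability at rumen
% outflow rate k then follows as
\[
\mathrm{ED} = a + \frac{b\,c}{c + k},
\]
% with the undegradable fraction estimated as 100 - (a + b) when the
% fractions are expressed as percentages.
```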
Procedia PDF Downloads 334
25703 An Analysis of Eco-efficiency and GHG Emission of Olive Oil Production in Northeast of Portugal
Authors: M. Feliciano, F. Maia, A. Gonçalves
Abstract:
The olive oil production sector plays an important role in the Portuguese economy. It has grown substantially over the last decade, increasing its weight in overall national exports. International market penetration for Mediterranean traditional products is increasingly demanding, especially in the Northern European markets, where consumers are looking for more sustainable products. To support this growing demand, this study addresses olive oil production from the environmental and eco-efficiency perspectives. The analysis considers two consecutive product life cycle stages: olive tree farming and olive oil extraction in mills. For olive farming, data collection covered two different organizations: a middle-size farm (~12 ha) (F1) and a large-size farm (~100 ha) (F2). Results from both farms show that olive collection activities are responsible for the largest amounts of greenhouse gas (GHG) emissions. For these activities, the estimated carbon footprint per kilogram of olives was higher in F2 (188 g CO2e/kg olives) than in F1 (148 g CO2e/kg olives). Considering olive oil extraction, two different mills were studied: one using a two-phase system (2P) and the other a three-phase system (3P). Results from the study of the two mills show a much higher use of water in 3P. Energy intensity (EI) is similar in both mills. When evaluating the GHG generated, two conditions are evaluated: a biomass-neutral condition, resulting in a carbon footprint higher in 3P (184 g CO2e/L olive oil) than in 2P (92 g CO2e/L olive oil); and a non-neutral biomass condition, in which 2P increases its carbon footprint to 273 g CO2e/L olive oil. When addressing the carbon footprint of possible combinations among the studied subsystems, the results suggest that olive harvesting is the major source of GHG.
Keywords: carbon footprint, environmental indicators, farming subsystem, industrial subsystem, olive oil
Procedia PDF Downloads 287
25702 New Evaluation of the Richness of Cactus (Opuntia) in Active Biomolecules and their Use in Agri-Food, Cosmetic, and Pharmaceutical
Authors: Lazhar Zourgui
Abstract:
Opuntia species are used as local medicinal interventions for chronic diseases and as food sources, mainly because they possess nutritional properties and biological activities. Opuntia ficus-indica (L.) Mill, commonly known as prickly pear or nopal cactus, is the most economically valuable plant in the Cactaceae family worldwide. It is a tropical or subtropical plant native to tropical and subtropical America, which can grow in arid and semi-arid climates. It belongs to the Cactaceae family of dicotyledonous angiosperms, of which about 1,500 species of cacti are known. The Opuntia plant is distributed throughout the world and has great economic potential. There are differences in the phytochemical composition of Opuntia species between wild and domesticated species and within the same species. It is an interesting source of plant bioactive compounds, which are compounds with nutritional benefits, generally classified into phenolic and non-phenolic compounds and pigments. Opuntia species are able to grow in almost all climates, for example, arid, temperate, and tropical climates, and their bioactive compound profiles change depending on the species, cultivar, and climatic conditions. Therefore, there is an opportunity for the discovery of new compounds from different Opuntia cultivars. The health benefits of prickly pear are widely demonstrated: there is ample evidence of the health benefits of consuming prickly pear due to its supply of nutrients and vitamins and its antioxidant properties arising from its content of bioactive compounds. In addition, prickly pear is used in the treatment of hyperglycemia and high cholesterol levels, and its consumption is linked to a lower incidence of coronary heart disease and certain types of cancer. It may be effective in insulin-independent type 2 diabetes mellitus. Opuntia ficus-indica seed oil has shown potent antioxidant and prophylactic effects. Industrial applications of these bioactive compounds are increasing: in addition to their application in the pharmaceutical industries, bioactive compounds are used in the food industry for the production of nutraceuticals and new food formulations (juices, drinks, jams, sweeteners). In my lecture, I will comprehensively review the phytochemical, nutritional, and bioactive compound composition of the different aerial and underground parts of Opuntia species. The biological activities and applications of Opuntia compounds are also discussed.
Keywords: medicinal plants, cactus, Opuntia, active biomolecules, biological activities
Procedia PDF Downloads 106
25701 Estimation of Greenhouse Gas (GHG) Reductions from Solar Cell Technology Using Bottom-up Approach and Scenario Analysis in South Korea
Authors: Jaehyung Jung, Kiman Kim, Heesang Eum
Abstract:
Solar cells are one of the main technologies for reducing greenhouse gas (GHG) emissions. Accurate estimation of the greenhouse gas reduction achieved by solar cell technology is therefore crucial for considering strategic applications of solar cells. The bottom-up approach, using operating data such as operation time and efficiency, is one methodology for improving the accuracy of the estimation. In this study, alternative GHG reductions from solar cell technology were estimated by a bottom-up approach for the indirect emission source (scope 2) in Korea, 2015. In addition, a scenario-based analysis was conducted to assess the effect of technological change with respect to efficiency improvement and rate of operation. In order to estimate GHG reductions from solar cell activities at operating-condition level, methodologies were derived from the 2006 IPCC Guidelines for National Greenhouse Gas Inventories and the guidelines for local government greenhouse inventories published in Korea, 2016. Indirect emission factors for electricity were obtained from the Korea Power Exchange (KPX) for 2011. As a result, the annual alternative GHG reductions were estimated as 21,504 tonCO2eq, and the annual average value was 1,536 tonCO2eq per solar cell installation. These estimates amounted to 91% of the design-capacity values. Estimation of the individual greenhouse gases showed that the largest contributor was carbon dioxide (CO2), which accounted for up to 99% of the total. The annual average GHG reduction from solar cells per year and unit installed capacity (MW) was estimated as 556 tonCO2eq/yr•MW. Scenario analysis showed that efficiency improvements of 5%, 10%, and 15% increased the annual GHG reductions by approximately 30%, 61%, and 91%, respectively, while raising the rate of operation to 100% increased them by 4%.
Keywords: bottom-up approach, greenhouse gas (GHG), reduction, scenario, solar cell
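The arithmetic skeleton of such a bottom-up estimate can be sketched as follows. Every number below is a placeholder, not the study's data, and the scenario mapping is deliberately simplified to a linear scaling; the study's larger scenario effects come from scenario definitions the abstract does not disclose.

```python
# Toy bottom-up GHG estimate: generation (MWh) x scope-2 grid
# emission factor, with scenario multipliers. All values assumed.
EMISSION_FACTOR = 0.46   # tCO2e per MWh, placeholder KPX-style factor

def annual_reduction(capacity_mw: float, hours: float,
                     rate_of_operation: float,
                     efficiency_gain: float = 0.0) -> float:
    generation_mwh = (capacity_mw * hours * rate_of_operation
                      * (1 + efficiency_gain))
    return generation_mwh * EMISSION_FACTOR

base = annual_reduction(capacity_mw=1.0, hours=8760, rate_of_operation=0.14)
print(f"baseline  : {base:8.1f} tCO2e/yr")
for gain in (0.05, 0.10, 0.15):      # efficiency-improvement scenarios
    s = annual_reduction(1.0, 8760, 0.14, efficiency_gain=gain)
    print(f"gain {gain:4.0%}: {s:8.1f} tCO2e/yr (+{s / base - 1:.1%})")
```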
Procedia PDF Downloads 220
25700 Freight Forwarders’ Liability: A Need for Revival of Unidroit Draft Convention after Six Decades
Authors: Mojtaba Eshraghi Arani
Abstract:
Freight forwarders, who are known as the architects of transportation, play a vital role in supply chain management. The package of various services which they provide has made the legal nature of freight forwarders very controversial, so that they might be qualified on one occasion as principal or carrier and, on other occasions, as agent of the shipper, as the case may be. They could even be involved in the transportation process as the agent of the shipping line, which makes the situation much more complicated. The courts in all countries have long had trouble distinguishing the 'forwarder as agent' from the 'forwarder as principal' (as is evident in the prominent case of 'Vastfame Camera Ltd v Birkart Globistics Ltd And Others', 2005, Hong Kong). In the case of a claim against the forwarder, it is not fully known which particular parameter would be used by the judge among the multiple, and sometimes contradictory, tests for determining the scope of the forwarder's liability. In particular, every country has its own legal parameters for qualifying freight forwarders, completely different from those of other countries, as is the case in France in comparison with Germany and England. The unpredictability of the courts' decisions in this regard has provided freight forwarders with the opportunity to impose any limitation or exception of liability while pretending to play the role of a principal, consequently making the cargo interests incur ever-increasing damage. The transportation industry needs to remove such uncertainty by unifying the national laws governing freight forwarders' liability. Long ago, in 1967, the International Institute for the Unification of Private Law (UNIDROIT) prepared a draft convention called the 'Draft Convention on Contract of Agency for Forwarding Agents Relating to International Carriage of Goods' (hereinafter the 'UNIDROIT draft convention'). The UNIDROIT draft convention provided a clear and certain framework for the liability of the freight forwarder in each capacity, as agent or carrier, but it failed to be transformed into a convention and was eventually consigned to oblivion. Today, nearly six decades later, the necessity of such a convention is again apparent. One might reason, however, that the same grounds, in particular the resistance by the forwarders' association, FIATA, still exist, and thus that it is not logical to revive a forgotten draft convention after such a long period of time. It is argued in this article that the main reason for resisting the UNIDROIT draft convention in the past was the pending effort to develop the 1980 United Nations Convention on International Multimodal Transport of Goods. However, the latter convention failed to come into force in due time, with no new accession since 1996, as a result of which the UNIDROIT draft convention should be strongly and immediately revived and submitted to the relevant diplomatic conference. A qualitative method based on the interpretation of collected data has been used in this manuscript; the sources of the data are international conventions and case law.
Keywords: freight forwarder, revival, agent, principal, UNIDROIT, draft convention
Procedia PDF Downloads 74
25699 A Study of the Adaptive Reuse for School Land Use Strategy: An Application of the Analytic Network Process and Big Data
Authors: Wann-Ming Wey
Abstract:
With today's popularity and progress of information technology, large data sets and their analysis are no longer a major conundrum. We can now not only use the relevant big data to analyze and emulate the possible status of urban development in the near future, but also provide a more comprehensive and reasonable basis for policy implementation to government units or decision-makers via the analysis and emulation results mentioned above. In this research, we take Taipei City as the research scope and use the relevant big data variables (e.g., population, facility utilization, and related social policy ratings) and the Analytic Network Process (ANP) approach to carry out in-depth research and discussion on the possible reduction of land use in primary and secondary schools of Taipei City. In addition to enhancing prosperous urban activities through better utilization of urban public facilities, the final results of this research could help improve the efficiency of urban land use in the future. Furthermore, the assessment model and research framework established in this research also provide a good reference for land use and adaptive reuse strategies for schools and other public facilities in the future.
Keywords: adaptive reuse, analytic network process, big data, land use strategy
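A minimal sketch of the core ANP/AHP computation: a priority vector is derived from a pairwise-comparison matrix as its principal eigenvector, here via power iteration. The 3x3 matrix comparing population, facility utilization, and policy ratings is a made-up example, not the study's judgments, and a full ANP would embed such blocks in a supermatrix raised to limiting powers.

```python
# Priority vector of a reciprocal judgment matrix via power iteration.
import numpy as np

def priority_vector(pairwise: np.ndarray, iters: int = 100) -> np.ndarray:
    w = np.ones(pairwise.shape[0])
    for _ in range(iters):
        w = pairwise @ w
        w /= w.sum()                  # normalise at each step
    return w

A = np.array([                        # hypothetical judgments:
    [1.0, 3.0, 0.5],                  # population vs. utilization vs.
    [1 / 3, 1.0, 0.25],               # policy ratings
    [2.0, 4.0, 1.0],
])
print("priorities:", np.round(priority_vector(A), 3))
```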
Procedia PDF Downloads 203
25698 Interoperability Standard for Data Exchange in Educational Documents in Professional and Technological Education: A Comparative Study and Feasibility Analysis for the Brazilian Context
Authors: Giovana Nunes Inocêncio
Abstract:
Professional and technological education (EPT) plays a pivotal role in equipping students for specialized careers, and it is imperative to establish a framework for efficient data exchange among educational institutions. The primary focus of this article is to address the pressing need for document interoperability within the context of EPT. The challenges, motivations, and benefits of implementing interoperability standards for digital educational documents are thoroughly explored. These documents include EPT completion certificates, academic records, and curricula. It is evident that the intersection of IT governance and interoperability standards holds the key to transforming the landscape of technical education in Brazil. IT governance provides the strategic framework for effective data management, aligning with educational objectives, ensuring compliance, and managing risks. By adopting interoperability standards, the technical education sector in Brazil can facilitate data exchange, enhance data security, and promote international recognition of qualifications. The use of the XML (Extensible Markup Language) standard further strengthens the foundation for structured data exchange, fostering efficient communication, standardization of curricula, and enhancement of educational materials. IT governance, interoperability standards, and data management play a critical role in driving the quality, efficiency, and security of technical education. The adoption of these standards fosters transparency, stakeholder coordination, and regulatory compliance, ultimately empowering the technical education sector to meet the dynamic demands of the 21st century.
Keywords: interoperability, education, standards, governance
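As a purely hypothetical illustration of the XML-structured exchange the article argues for, the sketch below builds a minimal completion-certificate record. The element names and schema are invented for the example; an actual interoperability standard would fix them normatively.

```python
# Hypothetical XML EPT completion record (invented schema).
import xml.etree.ElementTree as ET

cert = ET.Element("eptCertificate", version="1.0")
ET.SubElement(cert, "institution").text = "Instituto Federal (example)"
ET.SubElement(cert, "student").text = "Student Name (example)"
ET.SubElement(cert, "course").text = "Tecnico em Informatica (example)"
ET.SubElement(cert, "completionDate").text = "2024-12-20"
record = ET.SubElement(cert, "academicRecord")
ET.SubElement(record, "subject", grade="9.0").text = "Redes de Computadores"

print(ET.tostring(cert, encoding="unicode"))
```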
Procedia PDF Downloads 70
25697 Generating Real-Time Visual Summaries from Located Sensor-Based Data with Chorems
Authors: Z. Bouattou, R. Laurini, H. Belbachir
Abstract:
This paper describes a new approach for the automatic generation of visual summaries, combining cartographic visualization methods with real-time sensor data modeling. The concept of chorems seems an interesting candidate for visualizing real-time geographic database summaries. Chorems were defined by Roger Brunet (1980) as schematized visual representations of territories; however, time information is not yet handled in existing chorematic map approaches, an issue discussed in this paper. Our approach is based on spatial analysis: the values recorded at the same time by the available sensors are interpolated, so that the distributed observations over the study areas yield concentration fields. From these fields, and by using spatial data mining procedures on the fly, it is possible to extract important patterns as geographic rules. Those patterns are then visualized as chorems.
Keywords: geovisualization, spatial analytics, real-time, geographic data streams, sensors, chorems
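The interpolation step can be sketched with inverse-distance weighting, one common choice for turning simultaneous scattered sensor readings into a field on a grid; the abstract does not commit to a specific interpolator, so IDW here is an assumption.

```python
# Inverse-distance-weighted interpolation of one timestamp's sensor
# readings onto a regular grid (the "concentration field").
import numpy as np

def idw(sensors_xy: np.ndarray, values: np.ndarray,
        grid_xy: np.ndarray, power: float = 2.0) -> np.ndarray:
    field = np.empty(len(grid_xy))
    for i, g in enumerate(grid_xy):
        d = np.linalg.norm(sensors_xy - g, axis=1)
        if np.any(d < 1e-9):                 # grid point on a sensor
            field[i] = values[np.argmin(d)]
            continue
        w = 1.0 / d ** power
        field[i] = np.sum(w * values) / np.sum(w)
    return field

rng = np.random.default_rng(6)
sensors = rng.uniform(0, 10, size=(15, 2))   # sensor locations
readings = rng.uniform(20, 80, size=15)      # values at one timestamp
xs, ys = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
grid = np.column_stack([xs.ravel(), ys.ravel()])
print(idw(sensors, readings, grid).reshape(25, 25)[0, :5])
```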
Procedia PDF Downloads 401
25696 Need for Privacy in the Technological Era: An Analysis in the Indian Perspective
Authors: Amrashaa Singh
Abstract:
In the digital age and the vast cyberspace, data protection and privacy have become major issues of this technological era. There was a time when social media and online shopping websites were treated as a blessing for the people. But now the tables have turned, and people have started to look at them with suspicion. They are becoming aware of the privacy implications, and they do not feel as safe as they initially did. When Edward Snowden informed the world about the snooping that United States security agencies had been doing, the picture became clear for the people. After the Cambridge Analytica case, where the data of Facebook users were stored without their consent, doubts arose in people's minds about how safe they actually are. In India, the case of the spyware Pegasus also raised a lot of concerns: it was used to snoop on many human rights activists and lawyers, and the company that invented the spyware claims that it only sells it to governments. This paper deals with privacy concerns from the Indian perspective, using an analytical methodology. The Supreme Court of India recently declared the right to privacy a Fundamental Right under Article 21 of the Constitution of India. Further, the Government is working on the Data Protection Bill. The point to note is that India is still a developing country, and with the bill, the government aims at data localization. But there are doubts in the minds of many people that the Government would actually be snooping on the data of individuals; it looks more like an attempt to curb dissenters 'lawfully'. The focus of the paper is on these issues in India in light of the European Union (EU) General Data Protection Regulation (GDPR). The Indian Data Protection Bill is also said to be loosely based on the EU GDPR. But how helpful these laws would actually be is another concern, since the economic and social conditions in the two jurisdictions are very different. The paper aims at discussing these concerns, how good or bad the intention of the government behind the bill is, and how nations can act together and draft common regulations so that there is some uniformity in the laws and their application.
Keywords: Article 21, data protection, dissent, fundamental right, India, privacy
Procedia PDF Downloads 114
25695 An Online 3D Modeling Method Based on a Lossless Compression Algorithm
Authors: Jiankang Wang, Hongyang Yu
Abstract:
This paper proposes a portable online 3D modeling method. The method first utilizes a depth camera to collect data and compresses the depth data using a frame-by-frame lossless data compression method. The color image is encoded in the H.264 format. After the cloud obtains the color image and depth image, a 3D modeling method based on BundleFusion is used to complete the 3D modeling. The results of this study indicate that this method is portable, online, and highly efficient, and has a wide range of application prospects.
Keywords: 3D reconstruction, BundleFusion, lossless compression, depth image
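The paper does not disclose its exact codec, but a plausible sketch of frame-by-frame lossless depth compression is shown below: each 16-bit depth frame is delta-encoded against the previous frame and then deflated with zlib. Delta frames are mostly near-zero, which compresses well, and decoding reverses the steps exactly, so no depth precision is lost.

```python
# Assumed delta + zlib codec (illustrative, not the paper's scheme).
import zlib
import numpy as np

def compress_frame(frame: np.ndarray, prev: np.ndarray | None) -> bytes:
    delta = frame if prev is None else frame - prev   # uint16 wraps, and
    return zlib.compress(delta.tobytes(), level=6)    # decoding unwraps it

def decompress_frame(blob: bytes, prev: np.ndarray | None,
                     shape: tuple[int, int]) -> np.ndarray:
    delta = np.frombuffer(zlib.decompress(blob), dtype=np.uint16).reshape(shape)
    return delta if prev is None else delta + prev

rng = np.random.default_rng(7)
f0 = rng.integers(500, 4000, size=(480, 640), dtype=np.uint16)
f1 = f0 + rng.integers(0, 3, size=f0.shape, dtype=np.uint16)  # small motion

blob = compress_frame(f1, f0)
assert np.array_equal(decompress_frame(blob, f0, f1.shape), f1)  # lossless
print(f"compression ratio: {f1.nbytes / len(blob):.1f}x")
```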
Procedia PDF Downloads 82
25694 An Exploration of Policy-related Documents on District Heating and Cooling in Flanders: A Slow and Bottom-up Process
Authors: Isaura Bonneux
Abstract:
District heating and cooling (DHC) is increasingly recognized as a viable path towards sustainable heating and cooling. While some countries, like Sweden and Denmark, have a longstanding tradition of DHC, Belgium is lagging behind: the northern part of Belgium, Flanders, had a total of only 95 heating networks in July 2023. Nevertheless, it is increasingly exploring its possibilities to enhance the scope of DHC. DHC is a complex energy system, requiring a lot of collaboration between various stakeholders on various levels. It is therefore of interest to look closer at policy-related documents at the Flemish (regional) level, as these policies set the scene for DHC development in the Flemish region. This kind of analysis has not been undertaken so far. This paper asks the following research question: "Who talks about DHC, and in which way and context is DHC discussed in Flemish policy-related documents?" To answer this question, the Overton policy database was used to search for and retrieve relevant policy-related documents. Overton retrieves data from governments, think tanks, NGOs, and IGOs. In total, out of the 244 original results, 117 documents published between 2009 and 2023 were analyzed. Every selected document included theme keywords, the issuing policymaking department(s), date, and document type. These elements were used for quantitative data description and visualization. Further, qualitative content analysis revealed patterns and main themes regarding DHC in Flanders. Four main conclusions can be drawn. First, it is obvious from the timeframe that DHC is a new topic in Flanders that still receives limited attention; 2014, 2016, and 2017 were the years with the most documents, yet even this peak is only 12 documents. In addition, many documents mentioned DHC without much depth and painted it as a future scenario with a lot of uncertainty around it. The largest share of the issuing government departments had a link to either energy or climate (e.g., the Flemish Environmental Agency) or policy (e.g., the Socio-Economic Council of Flanders). Second, DHC is mentioned most within an 'Environment and Sustainability' context, followed by 'General Policy and Regulation'. This is intuitive, as DHC is perceived as a sustainable heating and cooling technique and this analysis comprises policy-related documents. Third, Flanders seems mostly interested in using waste or residual heat as a heating source for DHC; the harbors and waste incineration plants are identified as potential and promising supply sources. This approach tries to reconcile environmental and economic incentives. Last, local councils are assigned a central role, and the initiative is mostly taken by them. The policy documents and policy advice demonstrate that Flanders opts for a bottom-up organization. As DHC is very dependent on local conditions, this seems a logical step. Nevertheless, it can impede smaller councils from creating DHC networks and slow down the systematic and fast implementation of DHC throughout Flanders.
Keywords: district heating and cooling, Flanders, Overton database, policy analysis
Procedia PDF Downloads 44
25693 Chat-Based Online Counseling for Enhancing Wellness of Undergraduates with Emotional Crisis Tendency
Authors: Arunya Tuicomepee
Abstract:
During the past two decades, there has been an increasing number of studies on online counseling, especially among adolescents, who are familiar with the online world. This can be explained by the fact that this channel enables easier access to the young, who may not be ready for face-to-face services, possibly due to unease about revealing their personal problems to a stranger, the feeling that their problems are shameful, or the need to protect their image. In particular, teenagers prone to suicide or despair, who tend to keep things to themselves or isolate themselves from society, usually prefer types of services that require no face-to-face encounter and preserve their anonymity, such as online services. This study aimed to examine the effectiveness of chat-based online counseling for enhancing the wellness of undergraduates with emotional crisis tendencies. An experimental pretest-posttest control-group design was employed. Participants were 47 undergraduates (10 males and 37 females) with high emotional crisis tendencies. They were randomly assigned to an experimental group (24 students) and a control group (23 students). Participants in the experimental group received four 60-minute sessions of individual chat-based online counseling led by a counselor; those in the control group received no counseling sessions. The instruments were the Emotional Crisis Scale and the Wellness Scales. Two-way mixed-design multivariate analysis of variance was used for data analysis. Findings revealed that the posttest scores on wellness of those in the experimental group were higher than the scores of those in the control group, and the posttest scores on emotional crisis tendency of those in the experimental group were lower than the scores of those in the control group. Hence, this study suggests that chat-based online counseling services can become a helping source that more adolescents will recognize and turn to in the future and that deserves more attention.
Keywords: chat-based online counseling, emotional crisis, undergraduate student, wellness
Procedia PDF Downloads 242
25692 H∞ Sampled-Data Control for Linear Systems with Time-Varying Delays: Application to Power System
Authors: Chang-Ho Lee, Seung-Hoon Lee, Myeong-Jin Park, Oh-Min Kwon
Abstract:
This paper investigates improved stability criteria for sampled-data control of linear systems with disturbances and time-varying delays. Based on Lyapunov-Krasovskii stability theory, delay-dependent conditions sufficient to ensure H∞ stability of the system are derived in the form of linear matrix inequalities (LMIs). The effectiveness of the proposed method is shown in numerical examples.
Keywords: sampled-data control system, Lyapunov-Krasovskii functional, time-delay-dependent, LMI, H∞ control
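For context, a typical problem setup behind such criteria is sketched below using the standard input-delay reformulation of sampled-data control; the paper's exact functional and system matrices are not given in the abstract, so this is a generic template rather than the authors' formulation.

```latex
% Plant with time-varying state delay d(t), disturbance w(t), and a
% control input held between sampling instants t_k:
\[
\dot{x}(t) = A\,x(t) + A_d\,x(t - d(t)) + B\,u(t) + D\,w(t),
\qquad u(t) = K\,x(t_k), \quad t \in [t_k, t_{k+1}).
\]
% Input-delay approach: with \tau(t) = t - t_k (so \dot{\tau} = 1
% between samples), the sampled input becomes u(t) = K x(t - \tau(t)).
% A Lyapunov--Krasovskii functional of the generic form
\[
V(t) = x^{\top}(t) P x(t)
     + \int_{t-d(t)}^{t} x^{\top}(s) Q x(s)\, ds
     + \int_{-h}^{0}\!\!\int_{t+\theta}^{t} \dot{x}^{\top}(s) R\,
       \dot{x}(s)\, ds\, d\theta
\]
% then yields LMI conditions guaranteeing, under zero initial
% conditions, the H-infinity disturbance-attenuation bound
\[
\int_0^{\infty} z^{\top}(t) z(t)\, dt
  \;\le\; \gamma^{2} \int_0^{\infty} w^{\top}(t) w(t)\, dt .
\]
```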
Procedia PDF Downloads 320
25691 Logistics Information Systems in the Distribution of Flour in Nigeria
Authors: Cornelius Femi Popoola
Abstract:
This study investigated logistics information systems in the distribution of flour in Nigeria. A case study design was used, and 50 staff of Honeywell Flour Mill were sampled for the study. Data generated through a questionnaire were analysed using correlation and regression analysis. The findings of the study revealed that logistics information systems such as e-commerce, interactive telephone systems, and electronic data interchange positively correlated with the distribution of flour in Honeywell Flour Mill. The findings also showed that e-commerce, interactive telephone systems, and electronic data interchange jointly and positively contribute to the distribution of flour in Honeywell Flour Mill in Nigeria (R = .935; Adj. R2 = .642; F(3,47) = 14.739; p < .05). The study therefore recommended that Honeywell Flour Mill upgrade its logistics information systems to computer-to-computer communication of business transactions and documents, as well as adopt new technologies such as tracking-and-tracing systems (barcode scanning for packages and pallets), tracking vehicles with the Global Positioning System (GPS), measuring vehicle performance with 'black boxes' (containing logistics data), and Automatic Equipment Identification (AEI).
Keywords: e-commerce, electronic data interchange, flour distribution, information system, interactive telephone systems
Procedia PDF Downloads 553
25690 Cascaded Neural Network for Internal Temperature Forecasting in Induction Motor
Authors: Hidir S. Nogay
Abstract:
In this study, two systems were created to predict the interior temperature of an induction motor. One consisted of a simple ANN model with two layers, ten input parameters, and one output parameter. The other consisted of eight ANN models connected to each other in cascade, giving the cascaded ANN system 17 inputs. The main reason the cascaded system is used in this study is to accomplish more accurate estimation by increasing the number of inputs to the ANN system. The cascaded ANN system is compared with the simple conventional ANN model to demonstrate the mentioned advantages. The dataset was obtained from experimental applications; a small part of it was used to obtain more understandable graphs. The number of data points is 329, of which 30% were used for testing and validation. Test and validation data were determined for each ANN model separately, and the reliability of each model was tested. As a result of this study, it was found that the cascaded ANN system produced more accurate estimates than the conventional ANN model.
Keywords: cascaded neural network, internal temperature, inverter, three-phase induction motor
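The cascading idea, a later model receiving the original inputs augmented with an earlier model's output, can be sketched with a two-stage chain; the paper's actual eight-model topology, targets, and data are not public in this abstract, so everything below is a stand-in.

```python
# Two-stage cascade sketch: stage 1 predicts an intermediate quantity,
# stage 2 receives the original inputs plus stage 1's output, which is
# how cascading grows the input count. Data are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
X = rng.random(size=(329, 10))                     # 329 samples, 10 inputs
winding_temp = X @ rng.random(10) + rng.normal(0, 0.05, 329)  # stage-1 target
interior_temp = 0.7 * winding_temp + 0.3 * X[:, 0]            # final target

stage1 = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                      random_state=0).fit(X, winding_temp)

# Cascade: append stage-1 predictions as an extra input feature.
X_casc = np.column_stack([X, stage1.predict(X)])
stage2 = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                      random_state=0).fit(X_casc, interior_temp)

print("cascaded prediction:", stage2.predict(X_casc[:3]))
```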
Procedia PDF Downloads 345
25689 Big Data and Health: An Australian Perspective Which Highlights the Importance of Data Linkage to Support Health Research at a National Level
Authors: James Semmens, James Boyd, Anna Ferrante, Katrina Spilsbury, Sean Randall, Adrian Brown
Abstract:
‘Big data’ is a relatively new concept that describes data so large and complex that they exceed the storage or computing capacity of most systems to perform timely and accurate analyses. Health services generate large amounts of data from a wide variety of sources, such as administrative records, electronic health records, health insurance claims, and even smartphone health applications. Health data are viewed in Australia and internationally as highly sensitive. Strict ethical requirements must be met for the use of health data to support health research. These requirements differ markedly from those imposed on data use from industry or other government sectors and may have the effect of reducing the capacity of health data to be incorporated into the real-time demands of the big data environment. This ‘big data revolution’ is increasingly supported by national governments, who have invested significant funds in initiatives designed to develop and capitalize on big data and methods for data integration using record linkage. The benefits to health following research using linked administrative data are recognised internationally and by the Australian Government through the National Collaborative Research Infrastructure Strategy Roadmap, which outlined a multi-million dollar investment strategy to develop national record linkage capabilities. This led to the establishment of the Population Health Research Network (PHRN) to coordinate and champion this initiative. The purpose of the PHRN was to establish record linkage units in all Australian states, to support the implementation of secure data delivery and remote access laboratories for researchers, and to develop the Centre for Data Linkage for the linkage of national and cross-jurisdictional data. The Centre for Data Linkage has been established within Curtin University in Western Australia; it provides the essential record linkage infrastructure necessary for large-scale, cross-jurisdictional linkage of health-related data in Australia and uses a best-practice ‘separation principle’ to support data privacy and security. Privacy-preserving record linkage technology is also being developed to link records without the use of names, to overcome important legal and privacy constraints. This paper will present the findings of the first ‘proof of concept’ project selected to demonstrate the effectiveness of increased record linkage capacity in supporting nationally significant health research. This project explored how cross-jurisdictional linkage can inform the nature and extent of cross-border hospital use and hospital-related deaths. The technical challenges associated with national record linkage, and the extent of cross-border population movements, were explored as part of this pioneering research project. Access to person-level data linked across jurisdictions identified geographical hot spots of cross-border hospital use and hospital-related deaths in Australia. This has implications for the planning of health service delivery and for longitudinal follow-up studies, particularly those involving mobile populations.
Keywords: data integration, data linkage, health planning, health services research
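One widely used privacy-preserving linkage technique of the kind mentioned above encodes name q-grams into Bloom-filter-style bit sets and compares them with a Dice coefficient, so linkage units never exchange plaintext names. The sketch below illustrates that idea only; the Centre for Data Linkage's actual protocol is not described in the abstract, and the parameters are assumptions.

```python
# Illustrative Bloom-filter name encoding compared via Dice similarity.
import hashlib

BITS, HASHES = 256, 4   # assumed filter size and hash count

def bloom(name: str) -> set[int]:
    """Bit positions set for the bigrams of a padded name."""
    padded = f"_{name.lower()}_"
    grams = {padded[i:i + 2] for i in range(len(padded) - 1)}
    positions = set()
    for g in grams:
        for k in range(HASHES):
            h = hashlib.sha256(f"{k}:{g}".encode()).hexdigest()
            positions.add(int(h, 16) % BITS)
    return positions

def dice(a: set[int], b: set[int]) -> float:
    return 2 * len(a & b) / (len(a) + len(b))

# Linkage units compare encodings, never the names themselves.
print(dice(bloom("Katherine Smith"), bloom("Catherine Smith")))  # high
print(dice(bloom("Katherine Smith"), bloom("Adrian Brown")))     # low
```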
Procedia PDF Downloads 216