Search results for: food composition data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28931

24251 Geoelectric Survey for Groundwater Potential in Waziri Umaru Federal Polytechnic, Birnin Kebbi, Nigeria

Authors: Ibrahim Mohammed, Suleiman Taofiq, Muhammad Naziru Yahya

Abstract:

Geoelectrical measurements using the Schlumberger Vertical Electrical Sounding (VES) method were carried out in Waziri Umaru Federal Polytechnic, Birnin Kebbi, Nigeria, with the aim of determining the groundwater potential in the area. Twelve (12) VES soundings were collected using an ABEM SAS 300c Terrameter and analyzed using the IPI2win computer software, which gives an automatic interpretation of the apparent resistivity. The results of the interpretation of the VES data were used to characterize three to five geo-electric layers, from which the aquifer units were delineated. Data analysis indicated that the water-bearing formation exists in the third and fourth layers, with resistivity ranges of 312 to 767 Ωm and 9.51 to 681 Ωm, respectively. The thickness of the formation ranges from 14.7 to 41.8 m, while the depth is from 8.22 to 53.7 m. Based on the results obtained from the interpretation of the data, five (5) VES stations were recommended as the most viable locations for groundwater exploration in the study area: VES A4, A5, A6, B1, and B2. The VES results for the entire area indicated that the water-bearing formation occurs at a maximum depth of 53.7 m at the time of this survey.
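
The abstract does not reproduce its field equations; as a minimal sketch, the snippet below shows how apparent resistivity is typically computed for a Schlumberger array from the current-electrode half-spacing AB/2, the potential-electrode spacing MN and the measured resistance V/I, before curve interpretation in software such as IPI2win. The electrode spacings and readings used here are illustrative placeholders, not the Birnin Kebbi field data.

```python
import numpy as np

def schlumberger_apparent_resistivity(ab2, mn, resistance):
    """Apparent resistivity (ohm-m) for a Schlumberger array.

    ab2        : half current-electrode spacing AB/2 (m)
    mn         : potential-electrode spacing MN (m)
    resistance : measured resistance V/I (ohm)
    """
    k = np.pi * (ab2**2 - (mn / 2.0) ** 2) / mn   # geometric factor
    return k * resistance

# Illustrative (hypothetical) sounding readings
ab2 = np.array([1.5, 3.0, 6.0, 12.0, 25.0, 50.0])   # AB/2 in metres
mn = np.array([0.5, 0.5, 1.0, 1.0, 2.0, 2.0])       # MN in metres
r = np.array([42.1, 18.7, 9.3, 4.8, 2.6, 1.9])      # V/I in ohms

rho_a = schlumberger_apparent_resistivity(ab2, mn, r)
for a, rho in zip(ab2, rho_a):
    print(f"AB/2 = {a:5.1f} m  ->  apparent resistivity = {rho:7.1f} ohm-m")
```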

Keywords: aquifer, depth, groundwater, resistivity, Schlumberger

Procedia PDF Downloads 150
24250 The Integration of Patient Health Record Generated from Wearable and Internet of Things Devices into Health Information Exchanges

Authors: Dalvin D. Hill, Hector M. Castro Garcia

Abstract:

A growing number of individuals utilize wearable devices on a daily basis. The usage and functionality of these wearable devices vary from user to user. One popular usage of such devices is to track health-related activities, which are typically stored in a device’s memory or uploaded to an account in the cloud; based on the current trend, the data accumulated from the wearable device are stored in a standalone location. In many of these cases, this health-related data is not factored into the holistic view of a user’s health lifestyle or record. The health-related data generated from wearable and Internet of Things (IoT) devices can serve as empirical information for a medical provider, as the standalone data can add value to the holistic health record of a patient. This paper proposes a solution to incorporate the data gathered from these wearable and IoT devices into a patient’s Personal Health Record (PHR) stored within the confines of a Health Information Exchange (HIE).
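
The abstract does not name a data standard for the proposed integration. As an illustration only, the sketch below assumes an HL7 FHIR-style Observation resource (a common choice when exchanging device data with an HIE) and a hypothetical endpoint URL; it simply maps a wearable heart-rate sample into a JSON payload and posts it to the exchange.

```python
import json
from datetime import datetime, timezone

import requests  # assumes the requests package is installed

# Hypothetical HIE endpoint; the paper does not name a concrete API
HIE_FHIR_BASE = "https://hie.example.org/fhir"

def wearable_sample_to_observation(patient_id: str, heart_rate_bpm: int) -> dict:
    """Map a single wearable heart-rate reading to a FHIR-style Observation dict."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "category": [{"coding": [{"system": "http://terminology.hl7.org/CodeSystem/observation-category",
                                  "code": "vital-signs"}]}],
        "code": {"coding": [{"system": "http://loinc.org", "code": "8867-4",
                             "display": "Heart rate"}]},
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": datetime.now(timezone.utc).isoformat(),
        "valueQuantity": {"value": heart_rate_bpm, "unit": "beats/minute",
                          "system": "http://unitsofmeasure.org", "code": "/min"},
        "device": {"display": "Consumer wearable (illustrative)"},
    }

if __name__ == "__main__":
    obs = wearable_sample_to_observation(patient_id="12345", heart_rate_bpm=72)
    # Push the reading so it becomes part of the PHR held in the HIE
    resp = requests.post(f"{HIE_FHIR_BASE}/Observation",
                         data=json.dumps(obs),
                         headers={"Content-Type": "application/fhir+json"})
    print(resp.status_code)
```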

Keywords: electronic health record, health information exchanges, internet of things, personal health records, wearable devices, wearables

Procedia PDF Downloads 112
24249 Drying Kinetics, Energy Requirement, Bioactive Composition, and Mathematical Modeling of Allium Cepa Slices

Authors: Felix U. Asoiro, Meshack I. Simeon, Chinenye E. Azuka, Harami Solomon, Chukwuemeka J. Ohagwu

Abstract:

The drying kinetics, specific energy consumption (SEC), effective moisture diffusivity (EMD), flavonoid, phenolic, and vitamin C contents of onion slices dried under convective oven drying (COD) were compared with microwave drying (MD). Drying was performed with onion slice thicknesses of 2, 4, 6, and 8 mm; air drying temperatures of 60, 80, and 100°C for COD; and a microwave power of 450 W for MD. A decrease in slice thickness and an increase in drying air temperature led to a drop in the drying time. As thickness increased from 2 to 8 mm, EMD rose from 1.1 to 4.35 x 10⁻⁸ m² s⁻¹ at 60°C, from 1.1 to 5.6 x 10⁻⁸ m² s⁻¹ at 80°C, and from 1.25 to 6.12 x 10⁻⁸ m² s⁻¹ at 100°C, with MD treatments yielding the highest mean value (6.65 x 10⁻⁸ m² s⁻¹) at 8 mm. The maximum SEC for onion slices in COD was 238.27 kWh/kg H₂O (2 mm thickness) and the minimum was 39.4 kWh/kg H₂O (8 mm thickness), whereas the maximum during MD was 25.33 kWh/kg H₂O (8 mm thickness) and the minimum 18.7 kWh/kg H₂O (2 mm thickness). MD treatment gave a significant (p < 0.05) increase in the flavonoid (39.42 – 64.4%), phenolic (38.0 – 46.84%), and vitamin C (3.7 – 4.23 mg 100 g⁻¹) contents, while COD treatment at 60°C and 100°C had positive effects on only the vitamin C and phenolic contents, respectively. Among the models compared, the Weibull model gave the overall best fit (highest R² = 0.999; lowest SSE = 0.0002, RMSE = 0.0123, and χ² = 0.0004) when drying 2 mm onion slices at 100°C.
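
A minimal sketch of the Weibull thin-layer drying fit reported above is given below, assuming the common form MR = exp(-(t/α)^β) and using scipy; the drying-time and moisture-ratio values are illustrative placeholders, not the onion measurements from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_mr(t, alpha, beta):
    """Weibull thin-layer drying model: MR = exp(-(t/alpha)**beta)."""
    return np.exp(-(t / alpha) ** beta)

# Illustrative drying time (min) and moisture ratio data
t_min = np.array([0, 10, 20, 30, 45, 60, 90, 120], dtype=float)
mr_obs = np.array([1.00, 0.78, 0.58, 0.43, 0.27, 0.17, 0.07, 0.03])

popt, _ = curve_fit(weibull_mr, t_min, mr_obs, p0=(30.0, 1.0))
mr_fit = weibull_mr(t_min, *popt)

# Goodness-of-fit indicators of the kind reported in the abstract
residuals = mr_obs - mr_fit
sse = float(np.sum(residuals**2))
rmse = float(np.sqrt(np.mean(residuals**2)))
r2 = 1.0 - sse / float(np.sum((mr_obs - mr_obs.mean()) ** 2))
chi2 = sse / (len(mr_obs) - len(popt))  # reduced chi-square

print(f"alpha = {popt[0]:.2f} min, beta = {popt[1]:.2f}")
print(f"R2 = {r2:.4f}, SSE = {sse:.4f}, RMSE = {rmse:.4f}, chi2 = {chi2:.5f}")
```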

Keywords: allium cepa, drying kinetics, specific energy consumption, flavonoid, vitamin C, microwave oven drying

Procedia PDF Downloads 119
24248 System Identification in Presence of Outliers

Authors: Chao Yu, Qing-Guo Wang, Dan Zhang

Abstract:

The outlier detection problem for dynamic systems is formulated as a matrix decomposition problem with low-rank and sparse matrices and is further recast as a semidefinite programming (SDP) problem. A fast algorithm is presented to solve the resulting problem while preserving the structure of the solution matrix; it greatly reduces the computational cost compared with the standard interior-point method. The computational burden is further reduced by proper construction of subsets of the raw data without violating the low-rank property of the involved matrix. The proposed method can detect outliers exactly when there is little or no noise in the output observations. In the case of significant noise, a novel approach based on under-sampling with averaging is developed to denoise the data while retaining the saliency of the outliers, and the filtered data enable successful outlier detection with the proposed method where the existing filtering methods fail. Use of the “clean” data recovered by the proposed method can give much better parameter estimates than estimation based on the raw data.
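
The low-rank plus sparse formulation above can be illustrated with the classic principal component pursuit convex relaxation, min ||L||_* + λ||S||_1 subject to L + S = D, solved here by a plain ADMM loop with singular-value and soft thresholding on synthetic data. This is a generic sketch of the decomposition idea, not the authors' structure-preserving fast algorithm or their SDP solver.

```python
import numpy as np

def soft_threshold(x, tau):
    """Elementwise soft thresholding (proximal operator of the l1 norm)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def svd_threshold(x, tau):
    """Singular-value thresholding (proximal operator of the nuclear norm)."""
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    return u @ np.diag(np.maximum(s - tau, 0.0)) @ vt

def rpca_pcp(d, lam=None, mu=None, n_iter=500, tol=1e-7):
    """Principal component pursuit: decompose d into low-rank l + sparse s."""
    m, n = d.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / np.sum(np.abs(d))
    s = np.zeros_like(d)
    y = np.zeros_like(d)
    for _ in range(n_iter):
        l = svd_threshold(d - s + y / mu, 1.0 / mu)
        s = soft_threshold(d - l + y / mu, lam / mu)
        r = d - l - s          # primal residual
        y = y + mu * r         # dual update
        if np.linalg.norm(r) <= tol * np.linalg.norm(d):
            break
    return l, s

rng = np.random.default_rng(0)
# Synthetic "measurement" matrix: rank-2 dynamics plus a few gross outliers
low_rank = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 40))
sparse = np.zeros((60, 40))
idx = rng.choice(60 * 40, size=30, replace=False)
sparse.flat[idx] = rng.normal(scale=10.0, size=30)   # the outliers
d = low_rank + sparse

l_hat, s_hat = rpca_pcp(d)
detected = np.abs(s_hat) > 1e-3
print("true outliers recovered:", int(detected.flat[idx].sum()), "of", len(idx))
```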

Keywords: outlier detection, system identification, matrix decomposition, low-rank matrix, sparsity, semidefinite programming, interior-point methods, denoising

Procedia PDF Downloads 294
24247 Defining a Reference Architecture for Predictive Maintenance Systems: A Case Study Using the Microsoft Azure IoT-Cloud Components

Authors: Walter Bernhofer, Peter Haber, Tobias Mayer, Manfred Mayr, Markus Ziegler

Abstract:

Current preventive maintenance measures are cost-intensive and inefficient. With the sensor data available from state-of-the-art Internet of Things devices, new possibilities for automated data processing emerge. Current advances in data science and machine learning enable new, so-called predictive maintenance technologies, which empower data scientists to forecast possible system failures. The goal of this approach is to cut expenses in preventive maintenance by automating the detection of possible failures and to improve the efficiency and quality of maintenance measures. Additionally, sensor data monitoring can be centralized using this approach. This paper describes the approach of three students to define a reference architecture for a predictive maintenance solution in the Internet of Things domain with a connected smartphone app for service technicians. The reference architecture is validated by a case study, which is implemented with current Microsoft Azure cloud technologies. The results of the case study show that the reference architecture is valid and can be used to build a system for predictive maintenance execution with the cloud components of Microsoft Azure. The concepts used are technology-platform agnostic and can be reused on many different cloud platforms. The reference architecture applies to many use cases, such as gas station maintenance, elevator maintenance, and many more.
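
As a minimal illustration of the device-to-cloud ingestion step in such a reference architecture, the sketch below sends a simulated sensor reading to Azure IoT Hub with the azure-iot-device Python SDK. The connection string, device name and payload fields are placeholders, and the paper's actual components (smartphone app, downstream analytics) are not reproduced here.

```python
import json
import random
import time

# Requires: pip install azure-iot-device
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder connection string; a real one comes from the IoT Hub device registry
CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

def read_sensor() -> dict:
    """Simulated machine sensor reading (stand-in for a real field device)."""
    return {
        "deviceId": "pump-01",
        "timestamp": time.time(),
        "temperature_c": round(random.uniform(60.0, 90.0), 2),
        "vibration_mm_s": round(random.uniform(0.5, 7.5), 2),
    }

def main() -> None:
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    client.connect()
    try:
        for _ in range(10):                       # send ten telemetry messages
            msg = Message(json.dumps(read_sensor()))
            msg.content_type = "application/json"
            msg.content_encoding = "utf-8"
            client.send_message(msg)
            time.sleep(5)
    finally:
        client.shutdown()

if __name__ == "__main__":
    main()
```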

Keywords: case study, internet of things, predictive maintenance, reference architecture

Procedia PDF Downloads 230
24246 Purification of Bacillus Lipopeptides for Diverse Applications

Authors: Vivek Rangarajan, Kim G. Clarke

Abstract:

Bacillus lipopeptides are biosurfactants with wide-ranging applications in the medical, food, agricultural, environmental and cosmetic industries. They are produced as a mix of three families, surfactin, iturin and fengycin, each comprising a large number of homologues of varying functionalities. Consequently, the method and degree of purification of the lipopeptide cocktail become particularly important if the functionality of the lipopeptide end-product is to be maximized for the specific application. However, downstream processing of Bacillus lipopeptides is particularly challenging due to the subtle variations observed among the different lipopeptide homologues and isoforms. To date, the most frequently used lipopeptide purification operations have been acid precipitation, solvent extraction, membrane ultrafiltration, adsorption and size exclusion. RP-HPLC (reverse phase high pressure liquid chromatography) also has potential for fractionation of the lipopeptide homologues. In the studies presented here, membrane ultrafiltration and RP-HPLC were evaluated for lipopeptide purification to different degrees of purity for maximum functionality. Batch membrane ultrafiltration using 50 kDa polyether sulphone (PES) membranes resulted in lipopeptide recoveries of about 68% for surfactin and 82% for fengycin. The recovery was further improved to 95% by using size-conditioned lipopeptide micelles. Conditioning of the lipopeptides with Ca²⁺ ions resulted in uniformly sized micelles with an average size of 96.4 nm and a polydispersity index of 0.18. The size conditioning also facilitated the removal of impurities (molecular weight ranging between 2335-3500 Da) through operation of the system in dia-filtration mode, in a way similar to salt removal from proteins by dialysis. The resultant purified lipopeptide was devoid of macromolecular impurities and could ideally suit applications in the cosmetic and food industries. Enhanced purification using RP-HPLC was carried out in an analytical C18 column with the aim of fractionating the lipopeptides into their constituent homologues. The column was eluted with a mobile phase comprising acetonitrile and water over an acetonitrile gradient of 35% - 80% over 70 minutes. The gradient elution program resulted in as many as 41 fractions of individual lipopeptide homologues. The efficacy test of these fractions against fungal phytopathogens showed that the first 21 fractions, identified as homologues of iturins and fengycins, displayed maximum antifungal activities, suitable for biocontrol in the agricultural industry. Thus, in the current study, the downstream processing of lipopeptides leading to tailor-made products for selective applications was demonstrated using two major downstream unit operations.

Keywords: bacillus lipopeptides, membrane ultrafiltration, purification, RP-HPLC

Procedia PDF Downloads 193
24245 Predictive Maintenance: Machine Condition Real-Time Monitoring and Failure Prediction

Authors: Yan Zhang

Abstract:

Predictive maintenance is a technique to predict when an in-service machine will fail so that maintenance can be planned in advance. Analytics-driven predictive maintenance is gaining increasing attention in many industries such as manufacturing, utilities, aerospace, etc., along with the emerging demand of Internet of Things (IoT) applications and the maturity of technologies that support Big Data storage and processing. This study aims to build an end-to-end analytics solution that includes both real-time machine condition monitoring and machine learning based predictive analytics capabilities. The goal is to showcase a general predictive maintenance solution architecture, which suggests how the data generated from field machines can be collected, transmitted, stored, and analyzed. We use a publicly available aircraft engine run-to-failure dataset to illustrate the streaming analytics component and the batch failure prediction component. We outline the contributions of this study from four aspects. First, we compare the predictive maintenance problems from the view of the traditional reliability centered maintenance field, and from the view of the IoT applications. When evolving to the IoT era, predictive maintenance has shifted its focus from ensuring reliable machine operations to improve production/maintenance efficiency via any maintenance related tasks. It covers a variety of topics, including but not limited to: failure prediction, fault forecasting, failure detection and diagnosis, and recommendation of maintenance actions after failure. Second, we review the state-of-art technologies that enable a machine/device to transmit data all the way through the Cloud for storage and advanced analytics. These technologies vary drastically mainly based on the power source and functionality of the devices. For example, a consumer machine such as an elevator uses completely different data transmission protocols comparing to the sensor units in an environmental sensor network. The former may transfer data into the Cloud via WiFi directly. The latter usually uses radio communication inherent the network, and the data is stored in a staging data node before it can be transmitted into the Cloud when necessary. Third, we illustrate show to formulate a machine learning problem to predict machine fault/failures. By showing a step-by-step process of data labeling, feature engineering, model construction and evaluation, we share following experiences: (1) what are the specific data quality issues that have crucial impact on predictive maintenance use cases; (2) how to train and evaluate a model when training data contains inter-dependent records. Four, we review the tools available to build such a data pipeline that digests the data and produce insights. We show the tools we use including data injection, streaming data processing, machine learning model training, and the tool that coordinates/schedules different jobs. In addition, we show the visualization tool that creates rich data visualizations for both real-time insights and prediction results. To conclude, there are two key takeaways from this study. (1) It summarizes the landscape and challenges of predictive maintenance applications. (2) It takes an example in aerospace with publicly available data to illustrate each component in the proposed data pipeline and showcases how the solution can be deployed as a live demo.
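
A minimal sketch of the data-labeling step described above for a run-to-failure engine dataset (such as the public NASA C-MAPSS files) follows: the remaining useful life (RUL) of each record is derived from the last observed cycle of its engine unit, and a binary "failure within the next w cycles" flag is added for classification. The column names are the conventional C-MAPSS ones and are assumptions to be adapted to the actual files.

```python
import pandas as pd

def add_failure_labels(df: pd.DataFrame, window: int = 30) -> pd.DataFrame:
    """Label run-to-failure data: RUL per unit plus a binary near-failure flag.

    df is expected to hold one row per (unit, cycle) with columns
    'unit' and 'cycle'; sensor columns are carried through untouched.
    """
    max_cycle = df.groupby("unit")["cycle"].transform("max")
    df = df.copy()
    df["rul"] = max_cycle - df["cycle"]              # remaining useful life
    df["fail_within_w"] = (df["rul"] <= window).astype(int)
    return df

# Tiny illustrative frame (two units); real C-MAPSS files have 100+ units
data = pd.DataFrame({
    "unit":  [1, 1, 1, 1, 2, 2, 2],
    "cycle": [1, 2, 3, 4, 1, 2, 3],
    "sensor_2": [641.8, 642.1, 642.4, 643.0, 641.9, 642.3, 642.8],
})
labeled = add_failure_labels(data, window=2)
print(labeled[["unit", "cycle", "rul", "fail_within_w"]])
```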

Keywords: Internet of Things, machine learning, predictive maintenance, streaming data

Procedia PDF Downloads 372
24244 Research Trends on Magnetic Graphene for Water Treatment: A Bibliometric Analysis

Authors: J. C. M. Santos, J. C. A. Sousa, A. J. Rubio, L. S. Soletti, F. Gasparotto, N. U. Yamaguchi

Abstract:

Magnetic graphene has received widespread attention for its capability in water and wastewater treatment, which has attracted many researchers to this field. A bibliometric analysis based on the Web of Science database was employed to analyze the global scientific outputs on magnetic graphene for water treatment up to the present time (2012 to 2017), to improve the understanding of the research trends. The publication year, place of publication, institutes, funding agencies, journals, most cited articles, distribution of outputs in thematic categories, and applications were analyzed. Three major aspects, including type of pollutant, treatment process, and composite composition, have further contributed to revealing the research trends. The most relevant research aspects of the main technologies using magnetic graphene for water treatment are summarized in this paper. The results showed that research on magnetic graphene for water treatment is going through a period of decline that might be related to a saturated field and a lack of bibliometric studies. Thus, the results of the present work will help researchers establish future directions in further studies using magnetic graphene for water treatment.

Keywords: composite, graphene oxide, nanomaterials, scientometrics

Procedia PDF Downloads 234
24243 Farm Diversification and the Corresponding Policy for Its Implementation in Georgia

Authors: E. Kharaishvili

Abstract:

The paper shows the necessity of farm diversification in accordance with the current trends in the agricultural sector of Georgia. The possibilities for diversification and the corresponding economic policy are suggested. The causes that hinder the diversification of farms are revealed, possibilities for diversification are suggested, and the potential to increase employment through diversification is demonstrated. An index of harvest diversification is calculated based on the areas used for cereals and legumes, potatoes and vegetables, and other food crops. Crop and livestock production indexes are analyzed, and the correlation between the crop capacity index and value added per worker and per hectare is studied. Based on the research, farm diversification strategies and the priorities of the corresponding economic policy are presented, and relevant recommendations are suggested.
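
The abstract does not give the formula behind its harvest diversification index; a common choice in the farm diversification literature is a Herfindahl-based index, 1 - Σ pᵢ², where pᵢ is the share of sown area under crop group i. The sketch below uses that assumed form with illustrative area figures, not the Georgian statistics analysed in the paper.

```python
def harvest_diversification_index(areas_by_crop: dict) -> float:
    """Herfindahl-based diversification index: 1 - sum of squared area shares.

    0 means complete specialisation in one crop group; values approaching 1
    mean area is spread evenly over many groups.
    """
    total = sum(areas_by_crop.values())
    shares = [a / total for a in areas_by_crop.values()]
    return 1.0 - sum(s ** 2 for s in shares)

# Illustrative sown areas (hectares) by crop group, not actual Georgian data
farm = {
    "cereals_and_legumes": 120.0,
    "potatoes_and_vegetables": 45.0,
    "other_food_crops": 35.0,
}
print(f"Harvest diversification index: {harvest_diversification_index(farm):.3f}")
```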

Keywords: farm diversification, diversification index, agricultural development policy

Procedia PDF Downloads 449
24242 Road Condition Monitoring Using Built-in Vehicle Technology Data, Drones, and Deep Learning

Authors: Judith Mwakalonge, Geophrey Mbatta, Saidi Siuhi, Gurcan Comert, Cuthbert Ruseruka

Abstract:

Transportation agencies worldwide continuously monitor the condition of their roads to minimize road maintenance costs and maintain public safety and rideability. Existing methods for carrying out road condition surveys involve manual observation of roads using standard survey forms completed by qualified road condition surveyors or engineers, either on foot or by vehicle. Automated road condition survey vehicles exist; however, they are very expensive, since they require special vehicles equipped with sensors for data collection together with data processing and computing devices. The manual methods are expensive, time-consuming, and infrequent, and can hardly provide real-time information on road conditions. This study contributes to this arena by utilizing built-in vehicle technologies, drones, and deep learning to automate road condition surveys while using low-cost technology. A single model is trained to capture flexible pavement distresses (potholes, rutting, cracking, and raveling), thereby providing a more cost-effective and efficient road condition monitoring approach that can also provide real-time road conditions. Additionally, data fusion is employed to enhance the road condition assessment with data from vehicles and drones.
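
As an illustration of the single-model idea, the sketch below fine-tunes a pretrained torchvision backbone to classify the four flexible-pavement distress classes named above from image patches. The dataset directory, image size and training settings are placeholders, and the paper's actual architecture and its vehicle/drone data fusion are not specified in the abstract.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

CLASSES = ["pothole", "rutting", "cracking", "raveling"]   # the four distresses

# Placeholder folder layout: data/train/<class_name>/*.jpg
train_tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("data/train", transform=train_tf)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))   # 4-way head
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):                       # short illustrative schedule
    running_loss = 0.0
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item() * images.size(0)
    print(f"epoch {epoch + 1}: loss = {running_loss / len(train_set):.4f}")
```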

Keywords: road conditions, built-in vehicle technology, deep learning, drones

Procedia PDF Downloads 99
24241 Enhancing Student Learning Outcomes Using Engineering Design Process: Case Study in Physics Course

Authors: Thien Van Ngo

Abstract:

The engineering design process is a systematic approach to solving problems. It involves identifying a problem, brainstorming solutions, prototyping and testing solutions, and evaluating the results. The engineering design process can be used to teach students how to solve problems in a creative and innovative way. The aim of this study was to investigate the effectiveness of using the engineering design process to enhance student learning outcomes in a physics course. A mixed-methods design was used: quantitative data were collected using a pretest-posttest control group design, and qualitative data were collected using semi-structured interviews. The sample was 150 first-year students in the Department of Mechanical Engineering Technology at Cao Thang Technical College in Vietnam in the 2022-2023 school year. The pretest was administered to both groups at the beginning of the study, and the posttest was administered to both groups at the end of the study. The semi-structured interviews were conducted after the posttest with a sample of eight students in the experimental group. The quantitative data were analyzed using independent-samples t-tests, and the qualitative data were analyzed using thematic analysis. The quantitative results showed that students in the experimental group, who were taught using the engineering design process, had significantly higher post-test scores on physics problem-solving than students in the control group, who were taught using the conventional method. The qualitative results showed that students in the experimental group were more motivated and engaged in the learning process than students in the control group; they also reported that they found the engineering design process to be a more effective way of learning physics. The findings of this study suggest that the engineering design process can be an effective way of enhancing student learning outcomes in physics courses. The engineering design process engages students in the learning process and helps them to develop problem-solving skills.
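
A minimal sketch of the quantitative comparison described above follows: an independent-samples t-test on post-test problem-solving scores for the experimental and control groups, using scipy. The score arrays are illustrative placeholders rather than the study's data, and Welch's variant (not assuming equal variances) is used here as one reasonable choice.

```python
from scipy import stats

# Illustrative post-test scores (out of 10); not the study's actual data
experimental = [7.5, 8.0, 6.5, 9.0, 7.0, 8.5, 7.5, 8.0, 6.0, 9.5]
control      = [6.0, 6.5, 5.5, 7.0, 6.0, 7.5, 5.0, 6.5, 6.0, 7.0]

t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Post-test scores differ significantly between the two groups.")
else:
    print("No significant difference detected between the two groups.")
```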

Keywords: engineering design process, problem-solving, learning outcome of physics, students’ physics competencies, deep learning

Procedia PDF Downloads 57
24240 Using Business Intelligence Capabilities to Improve the Quality of Decision-Making: A Case Study of Mellat Bank

Authors: Jalal Haghighat Monfared, Zahra Akbari

Abstract:

Today, business executives need useful information to make better decisions. Banks also use information tools, with the help of business intelligence, to rapidly extract information from data sources and direct the decision-making process toward their desired goals. This research investigates whether there is a relationship between the quality of decision making and the business intelligence capabilities of Mellat Bank. Each of the factors studied is divided into several components, and these and their relationships are measured by a questionnaire. The statistical population of this study consists of all managers and experts of Mellat Bank's general departments (190 people) who use business intelligence reports. The sample size of 123 was determined randomly by statistical methods. Relevant statistical inference was used for data analysis and hypothesis testing. In the first stage, the normality of the data was examined using the Kolmogorov-Smirnov test, and in the next stage, the construct validity of both variables and their resulting indexes was verified using confirmatory factor analysis. Finally, the research hypotheses were tested using structural equation modeling and Pearson's correlation coefficient. The results confirmed the existence of a positive relationship between decision quality and business intelligence capabilities in Mellat Bank. Among the various capabilities, including data quality, correlation with other systems, user access, flexibility, and risk management support, the flexibility of the business intelligence system was the most strongly correlated with the dependent variable of the present research. This shows that Mellat Bank needs to pay more attention to choosing business intelligence systems with high flexibility in terms of the ability to produce custom-formatted reports. The quality of data in business intelligence systems also showed a strong relationship with the quality of decision making. Therefore, improving the quality of data, including the source of the data (internal or external), the type of data (quantitative or qualitative), the credibility of the data, and the perceptions of those who use the business intelligence system, improves the quality of decision making at Mellat Bank.
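
A minimal sketch of two of the statistical steps described above (the Kolmogorov-Smirnov normality check and the Pearson correlation between a BI-capability score and a decision-quality score) is given below, using scipy on illustrative questionnaire-style scores; the confirmatory factor analysis and structural equation model are beyond this sketch.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Illustrative composite questionnaire scores for n = 123 respondents
n = 123
bi_capability = rng.normal(loc=3.6, scale=0.5, size=n)               # 1-5 scale
decision_quality = 0.6 * bi_capability + rng.normal(0, 0.4, size=n)  # correlated by construction

# Kolmogorov-Smirnov test against a normal distribution with the sample's own moments
for name, x in [("BI capability", bi_capability), ("decision quality", decision_quality)]:
    ks_stat, ks_p = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1)))
    print(f"{name}: KS statistic = {ks_stat:.3f}, p = {ks_p:.3f}")

# Pearson correlation between the two constructs
r, p = stats.pearsonr(bi_capability, decision_quality)
print(f"Pearson r = {r:.3f}, p = {p:.4g}")
```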

Keywords: business intelligence, business intelligence capability, decision making, decision quality

Procedia PDF Downloads 101
24239 Modelling of Geotechnical Data Using Geographic Information System and MATLAB for Eastern Ahmedabad City, Gujarat

Authors: Rahul Patel

Abstract:

Ahmedabad, a city located in western India, is experiencing rapid growth due to urbanization and industrialization. It is projected to become a metropolitan city in the near future, resulting in various construction activities. Soil testing is necessary before construction can commence, so construction companies and contractors must conduct soil testing periodically. The focus of this study is the process of creating a digitally formatted spatial database that integrates geotechnical data with a Geographic Information System (GIS). Building a comprehensive geotechnical (geo-)database involves three steps: collecting borehole data from reputable sources, verifying the accuracy and redundancy of the data, and standardizing and organizing the geotechnical information for integration into the database. Once the database is complete, it is integrated with GIS, allowing users to visualize, analyze, and interpret geotechnical information spatially. Using a Topographic to Raster interpolation process in GIS, estimated values are assigned to all locations based on the sampled geotechnical data values. The study area was contoured for SPT N-values, soil classification, Φ-values, and bearing capacity (T/m²). Various interpolation techniques were cross-validated to ensure the accuracy of the information. The resulting GIS map enables the calculation of SPT N-values, Φ-values, and bearing capacities for different footing widths at various depths. This study highlights the potential of GIS in providing an efficient solution to complex problems that would otherwise be tedious to address through other means. Not only does GIS offer greater accuracy, but it also generates valuable information that can be used as input for correlation analysis. Furthermore, this system serves as a decision support tool for geotechnical engineers.
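
The GIS workflow above assigns estimated values to unsampled locations by interpolation; a minimal stand-in for that step is sketched below using scipy's griddata rather than the ArcGIS raster interpolation tool, with hypothetical borehole coordinates and SPT N-values rather than the Ahmedabad data.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical borehole locations (projected coordinates, m) and SPT N-values
boreholes = np.array([
    [500.0, 200.0], [620.0, 310.0], [710.0, 150.0],
    [850.0, 420.0], [400.0, 380.0], [560.0, 470.0],
])
spt_n = np.array([12.0, 18.0, 9.0, 25.0, 15.0, 21.0])

# Regular grid covering the study area
xi = np.linspace(380.0, 870.0, 50)
yi = np.linspace(130.0, 490.0, 50)
grid_x, grid_y = np.meshgrid(xi, yi)

# Interpolated SPT N-value surface (cubic inside the convex hull,
# nearest-neighbour used to fill the remaining cells)
surface = griddata(boreholes, spt_n, (grid_x, grid_y), method="cubic")
fill = griddata(boreholes, spt_n, (grid_x, grid_y), method="nearest")
surface = np.where(np.isnan(surface), fill, surface)

print("Interpolated SPT N-value at grid centre:",
      round(float(surface[25, 25]), 1))
```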

Keywords: ArcGIS, borehole data, geographic information system, geo-database, interpolation, SPT N-value, soil classification, Φ-Value, bearing capacity

Procedia PDF Downloads 59
24238 Using TRACE and SNAP Codes to Establish the Model of Maanshan PWR for SBO Accident

Authors: B. R. Shen, J. R. Wang, J. H. Yang, S. W. Chen, C. Shih, Y. Chiang, Y. F. Chang, Y. H. Huang

Abstract:

In this research, the TRACE code with the interface code SNAP was used to simulate and analyze a station blackout (SBO) accident in the Maanshan pressurized water reactor (PWR) nuclear power plant (NPP). There are four main steps in this research. First, the SBO accident data of Maanshan NPP were collected. Second, the TRACE/SNAP model of Maanshan NPP was established using these data. Third, this TRACE/SNAP model was used to perform the simulation and analysis of the SBO accident. Finally, the simulation and analysis of the SBO with mitigation equipment were performed. The analysis results of TRACE are consistent with the data of Maanshan NPP. According to the TRACE predictions, the mitigation equipment of Maanshan can maintain the safety of the plant during an SBO.

Keywords: pressurized water reactor (PWR), TRACE, station blackout (SBO), Maanshan

Procedia PDF Downloads 177
24237 Structural and Phase Transformations of Pure and Silica Treated Nanofibrous Al₂O₃

Authors: T. H. N. Nguyen, A. Khodan, M. Amamra, J-V. Vignes, A. Kanaev

Abstract:

Ultraporous nanofibrous alumina (NOA, Al₂O₃·nH₂O) was synthesized by oxidation of laminated aluminium plates through a liquid mercury-silver layer in a humid atmosphere (~80%) at 25°C. The material has an extremely high purity (99%), porosity (90%) and specific area (300 m²/g). Subsequent annealing of raw NOA permits obtaining pure transition-phase (γ and θ) nanostructured materials. In this contribution, we report on the chemical, structural and phase transformations of pure NOA and of NOA modified by impregnation with trimethylethoxysilane (TMES) and tetraethoxysilane (TEOS) during thermal annealing in the temperature range between 20 and 1650°C. The mass density, specific area and average diameter are analysed. A 3D model of pure NOA monoliths and silica-modified NOA is proposed, which successfully describes the evolution of the specific area, mass density and phase transformations. Activation energies of mass transport in the two regimes of surface diffusion and bulk sintering were obtained based on this model. We conclude that the modifications of the NOA morphology, chemical composition and phase transitions have a common origin.

Keywords: nanostructured materials, alumina (Al₂O₃), morphology, phase transitions

Procedia PDF Downloads 370
24236 Investigation of Nd-Al-Fe Added Nd-Fe-B Alloy Produced by Arc Melting

Authors: Gülten Sadullahoğlu, Baki Altuncevahir

Abstract:

The scope of this study is to investigate the magnetic properties and microstructure of Nd₂Fe₁₄B₁ alloyed with Nd₃₃.₄Fe₆₂.₆Al₄ and heat treated at different temperatures. The stoichiometric Nd₂Fe₁₄B hard magnetic alloy and the Nd₃₃.₄Fe₆₂.₆Al₄ composition were produced by arc melting under an argon atmosphere. The Nd₃₃.₄Fe₆₂.₆Al₄ alloy was added to the 2:14:1 hard magnetic alloy at 48% by weight and melted again by arc melting. The mixture was then heat treated at 600, 700 and 800 ˚C for 3 h under vacuum. In AC magnetic susceptibility measurements of the as-cast sample, the signals decreased sharply at 101 ˚C and 313 ˚C, corresponding to the Curie temperatures of two ferromagnetic phases in addition to the Fe phase. For the sample annealed at 600 ˚C, two Curie points were observed, at about 257 ˚C and 313 ˚C; however, the phase corresponding to the Curie temperature of 101 ˚C had disappeared. According to the magnetization measurements, the saturation magnetization had its highest value of 99.8 emu/g for the sample annealed at 600 ˚C and decreased to 57.66 and 28.6 emu/g for the samples annealed at 700 ˚C and 800 ˚C, respectively. Heat treatment resulted in the evolution of a new phase that caused changes in the magnetic properties of the alloys. In order to have a clear picture, the identification of these phases is under investigation by XRD and SEM-EDX analysis.

Keywords: NdFeB hard magnets, bulk magnetic materials, arc melting, Curie temperature, heat treatment

Procedia PDF Downloads 268
24235 A Comparative and Doctrinal Analysis towards the Investigation of a Right to Be Forgotten in Hong Kong

Authors: Jojo Y. C. Mo

Abstract:

Memories are good. They remind us of people, places and experiences that we cherish. But memories cannot be changed, and there may well be memories that we do not want to remember. This is particularly true in relation to information which causes us embarrassment and humiliation or simply because it is private; we all want to erase or delete such information. This desire to delete was recently recognised by the Court of Justice of the European Union in the 2014 case of Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González, in which the court ordered Google to remove links to some information about the complainant which he wished to be removed. This so-called ‘right to be forgotten’ received serious attention and, significantly, the European Council and the European Parliament enacted the General Data Protection Regulation (GDPR) to provide a more structured and normative framework for the implementation of the right to be forgotten across the EU. This development in data protection laws will undoubtedly have a significant impact on companies and corporations not just within the EU but outside it as well. Hong Kong, being one of the world’s leading financial and commercial centers as well as one of the first jurisdictions in Asia to implement a comprehensive piece of data protection legislation, is therefore a jurisdiction that is worth looking into. This article/project aims to investigate the following: a) whether there is a right to be forgotten under the existing Hong Kong data protection legislation, and b) if not, whether such a provision is necessary and why. This article utilises a comparative methodology based on a study of primary and secondary resources, including scholarly articles, government and law commission reports and working papers, and relevant international treaties, constitutional documents, case law and legislation. The author primarily engages in literature and case-law review as well as comparative and doctrinal analyses. The completion of this article will provide privacy researchers with more concrete principles and data to conduct further research on privacy and data protection in Hong Kong and internationally, and will provide a basis for policy makers in assessing the rationale and need for a right to be forgotten in Hong Kong.

Keywords: privacy, right to be forgotten, data protection, Hong Kong

Procedia PDF Downloads 171
24234 Damage Assessment Based on Full-Polarimetric Decompositions in the 2017 Colombia Landslide

Authors: Hyeongju Jeon, Yonghyun Kim, Yongil Kim

Abstract:

Synthetic Aperture Radar (SAR) is an effective tool for assessing damage induced by disasters due to its all-weather and day/night acquisition capability. In this paper, the 2017 Colombia landslide was observed using full-polarimetric ALOS/PALSAR-2 data. Polarimetric decompositions, including the Freeman-Durden decomposition and the Cloude decomposition, are utilized to analyze the changes in scattering mechanisms before and after the landslide. These analyses are used to detect the areas damaged by the landslide. Experimental results validate the effectiveness of the full-polarimetric SAR data, since the damaged areas can be well discriminated. Thus, we can conclude that the proposed method using full-polarimetric data has great potential for the damage assessment of landslides.
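
As a compact illustration of the Cloude (eigenvalue-based) decomposition mentioned above, the sketch below computes entropy, anisotropy and the mean alpha angle from a single 3x3 polarimetric coherency matrix. The matrix here is synthetic, not derived from the ALOS/PALSAR-2 scene, and the Freeman-Durden model-based decomposition is not reproduced.

```python
import numpy as np

def cloude_pottier(t3: np.ndarray):
    """Entropy H, anisotropy A and mean alpha angle from a 3x3 coherency matrix."""
    eigval, eigvec = np.linalg.eigh(t3)               # ascending, real eigenvalues
    eigval = np.clip(eigval[::-1].real, 1e-12, None)  # sort descending, keep positive
    eigvec = eigvec[:, ::-1]
    p = eigval / eigval.sum()                         # pseudo-probabilities
    entropy = float(-(p * np.log(p) / np.log(3)).sum())
    anisotropy = float((eigval[1] - eigval[2]) / (eigval[1] + eigval[2]))
    alpha_i = np.arccos(np.clip(np.abs(eigvec[0, :]), 0.0, 1.0))  # per-eigenvector alpha
    mean_alpha_deg = float(np.degrees((p * alpha_i).sum()))
    return entropy, anisotropy, mean_alpha_deg

# Synthetic Hermitian coherency matrix (surface-dominated pixel, illustrative only)
t3 = np.array([
    [0.80,         0.10 + 0.05j, 0.02 - 0.01j],
    [0.10 - 0.05j, 0.15,         0.01 + 0.00j],
    [0.02 + 0.01j, 0.01 - 0.00j, 0.05        ],
], dtype=complex)

h, a, alpha = cloude_pottier(t3)
print(f"Entropy H = {h:.3f}, Anisotropy A = {a:.3f}, mean alpha = {alpha:.1f} deg")
```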

Keywords: Synthetic Aperture Radar (SAR), polarimetric decomposition, damage assessment, landslide

Procedia PDF Downloads 377
24233 Using Historical Data for Stock Prediction

Authors: Sofia Stoica

Abstract:

In this paper, we use historical data to predict the stock price of a tech company. To this end, we use a dataset consisting of the stock prices over the past five years of ten major tech companies: Adobe, Amazon, Apple, Facebook, Google, Microsoft, Netflix, Oracle, Salesforce, and Tesla. We experimented with a variety of models, including a linear regression model, K-nearest neighbors (KNN), and a sequential neural network, and with algorithms such as Multiplicative Weight Update and AdaBoost. We found that the sequential neural network performed the best, with a testing error of 0.18%. Interestingly, the linear model performed second best, with a testing error of 0.73%. These results show that using historical data is enough to obtain high accuracy, and that a simple algorithm like linear regression achieves performance similar to more sophisticated models while taking less time and fewer resources to implement.
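
A minimal sketch of the linear-regression baseline mentioned above follows: past closing prices over a sliding window are used as features to predict the next day's close, with a chronological train/test split. The synthetic price series stands in for the actual five-year dataset, and the window size and error metric are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)

# Synthetic daily closing prices (random walk) standing in for real stock data
prices = 150 + np.cumsum(rng.normal(0.1, 1.5, size=1250))

def make_windows(series, window=10):
    """Features = previous `window` closes, target = next close."""
    x = np.array([series[i - window:i] for i in range(window, len(series))])
    y = series[window:]
    return x, y

x, y = make_windows(prices, window=10)
split = int(0.8 * len(x))                      # chronological split, no shuffling
x_train, x_test = x[:split], x[split:]
y_train, y_test = y[:split], y[split:]

model = LinearRegression().fit(x_train, y_train)
pred = model.predict(x_test)

# Mean absolute percentage error, comparable to the "testing error" quoted above
mape = float(np.mean(np.abs((y_test - pred) / y_test))) * 100
print(f"Test MAPE: {mape:.2f}%")
```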

Keywords: finance, machine learning, opening price, stock market

Procedia PDF Downloads 162
24232 Supervised Learning for Cyber Threat Intelligence

Authors: Jihen Bennaceur, Wissem Zouaghi, Ali Mabrouk

Abstract:

The major aim of cyber threat intelligence (CTI) is to provide sophisticated knowledge about cybersecurity threats to ensure internal and external safeguards against modern cyberattacks. Inaccurate, incomplete, outdated, and low-value threat intelligence is the main problem. Therefore, data analysis based on AI algorithms is one of the emerging solutions to overcome threat information-sharing issues. In this paper, we propose a supervised machine learning-based algorithm to improve threat information sharing by providing a sophisticated classification of cyber threats and data. Extensive simulations investigate the overall accuracy, precision, recall, F1-score, and support to validate the designed algorithm and to compare it with several supervised machine learning algorithms.
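
A minimal sketch of the evaluation described above follows: a supervised classifier is trained on labelled threat records and scored with accuracy, precision, recall, F1-score and support. The feature matrix, the choice of random forest, and the three class names are synthetic placeholders; the paper's actual feature set and algorithm are not given in the abstract.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for labelled threat-intelligence records (3 threat classes)
x, y = make_classification(n_samples=1500, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.3,
                                                    random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(x_train, y_train)
pred = clf.predict(x_test)

print(f"Accuracy: {accuracy_score(y_test, pred):.3f}")
# Per-class precision, recall, F1-score and support, as listed in the abstract
print(classification_report(y_test, pred,
                            target_names=["malware", "phishing", "ddos"]))
```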

Keywords: threat information sharing, supervised learning, data classification, performance evaluation

Procedia PDF Downloads 134
24231 A Fundamental Study on the Molecular Chemistry of Agarwood Water Mixture

Authors: Fatmawati Adam, Saidatul Syaima Mat Tari, Saiful Nizam Tajuddin, Nurul Salwa Azliyana Hamzah

Abstract:

The essential oil of agarwood, known as gaharu in Malay, is highly prized for its value in luxury fragrances and incense. However, the complexity of the chemical composition of agarwood itself is the main challenge to establishing an effective recovery method that is able to ensure uniform quality and standards for each batch of essential oil production. Agarwood markers are actually a blend of volatile and non-volatile compounds. While the volatile molecules can easily be retrieved by the present distillation technique, high solubility in water is the limiting factor for the latter. With regard to this, an elementary chemical resolution study was performed on a commercial agarwood essential oil-water mixture by the application of preparative HPLC and FTIR. Interpretation of the results leads to the theoretical postulation that the agarwood water mixture comprises agarospirol, jinkohol, jinkoh eremol and khusenol. This study pinpoints the chemical characteristics of the water-soluble (non-volatile) agarwood compounds and will therefore give researchers insight to develop a more strategic technique for their extraction. Thereafter, the optimum quality of this essential oil could be controlled in an improved way.

Keywords: agarwood, Aquilaria malaccensis, agarospirol, jinkohol, jinkoh eremol, khusenol

Procedia PDF Downloads 535
24230 The Orthodox Church's Heritage in Syria and the Journey of Syriac Music between Originality and Renewal

Authors: Marilyn Maksoud

Abstract:

This article discusses the heritage of the Orthodox Church; additionally, it describes the origins, composition, and characteristics of the Orthodox Christian cultural identity in Syria and the liturgical traditions of the Church in the literature. The eight tunes and their original use, as well as the historical and anthropological importance of the most important Orthodox churches in Syria, are also discussed. Finally, the role and works of the composer Nuri Iskandar in reviving Christian music are described. The study follows a "cultural dialogue" methodology based on the recognition of equal cultures, drawing on practical and bibliographic sources (books and articles in German, French, Arabic, and English), in addition to my practical experience of chanting in the Syriac Aramaic language in some churches in Syria and Russia. This study concludes that the roots of the characteristics of Orthodox Christian culture in Syria go back to the original eight Syriac melodies. Additionally, the origins of the major and minor scales were traced as an extension of Syriac Christian melodies that originated thousands of years ago in the Syrian lands.

Keywords: church culture in Syria, Syriac orthodox music, Syriac orthodox church, Aramaic semitic language, Syriac, Syrian church melodies

Procedia PDF Downloads 169
24229 Methodologies, Findings, Discussion, and Limitations in Global, Multi-Lingual Research: We Are All Alone - Chinese Internet Drama

Authors: Patricia Portugal Marques de Carvalho Lourenco

Abstract:

A three-phase, multi-lingual methodological path was designed, constructed and carried out using the 2020 Chinese internet drama series We Are All Alone as a case study. Phase one, the backbone of the research, comprised secondary data analysis, providing the structure on which the next two phases would be built. Phase one incorporated a Google Scholar and a Baidu Index analysis, the Star Network Influence Index and the top two drama reviews on Mydramalist.com, along with an article written about the drama and scrutiny of China-related blogs and websites. Phase two was field research carried out across Latin Europe, and phase three was focused on social media, taking into account that perceptions are conditioned by memory and based on the recall of past ideas. Overall, the research has shown the poor cultural expression of Chinese entertainment in Latin Europe and demonstrated the absence of Chinese content in French, Italian, Portuguese and Spanish business-to-consumer retailers, a reflection of its low significance in Latin European markets and of the short life cycle of entertainment products in general: bubble-gum, disposable goods without a mid- to long-term effect on consumers' lives. The process of conducting comprehensive international research was complex and time-consuming, with data not always available in Mandarin, the researcher's linguistic limitations, limited knowledge of Chinese culture, and issues of cultural equivalence. Despite steps taken to minimize the limitations of the proposed international research, theoretical limitations concerning Latin Europe and China still occurred. Data accuracy was disputable; sampling and data collection/analysis methods were heterogeneous; and ascertaining the data requirements and the method of analysis needed to achieve construct equivalence was challenging and laborious to operationalize. Secondary data were also often not readily available in Mandarin; yet, in spite of the array of limitations, the research was done, and results were produced.

Keywords: research methodologies, international research, primary data, secondary data, research limitations, online dramas, China, Latin Europe

Procedia PDF Downloads 61
24228 Analysis of Bio-Oil Produced by Pyrolysis of Coconut Shell

Authors: D. S. Fardhyanti, A. Damayanti

Abstract:

The utilization of biomass as a source of new and renewable energy is being pursued. One of the technologies for converting biomass into an energy source is pyrolysis, which converts biomass into more valuable products such as bio-oil. Bio-oil is a liquid produced by condensation of the vapours from the pyrolysis of coconut shells. The components of the coconut shell, e.g. hemicellulose, cellulose and lignin, are oxidized to phenolic compounds, the main components of the bio-oil. The phenolic compounds in bio-oil are corrosive; they cause various difficulties in the combustion system because of a high viscosity, low calorific value, corrosiveness, and instability. Phenolic compounds are also very valuable, as phenol is used as the main component in the manufacture of antiseptics, disinfectants (known as Lysol) and deodorizers. The experiments occurred at atmospheric pressure in a pyrolysis reactor at temperatures ranging from 300 °C to 350 °C, with a heating rate of 10 °C/min and a holding time of 1 hour at the pyrolysis temperature. Gas Chromatography-Mass Spectroscopy (GC-MS) was used to analyze the bio-oil components. The obtained bio-oil has a viscosity of 1.46 cP, a density of 1.50 g/cm³, a calorific value of 16.9 MJ/kg, and a molecular weight of 1996.64. The GC-MS analysis showed that the bio-oil contained phenol (40.01%), ethyl ester (37.60%), 2-methoxy-phenol (7.02%), furfural (5.45%), formic acid (4.02%), 1-hydroxy-2-butanone (3.89%), and 3-methyl-1,2-cyclopentanedione (2.01%).

Keywords: bio-oil, pyrolysis, coconut shell, phenol, gas chromatography-mass spectroscopy

Procedia PDF Downloads 229
24227 Node Insertion in Coalescence Hidden-Variable Fractal Interpolation Surface

Authors: Srijanani Anurag Prasad

Abstract:

The Coalescence Hidden-variable Fractal Interpolation Surface (CHFIS) was built by combining interpolation data from the Iterated Function System (IFS). The interpolation data in a CHFIS comprise a row and/or column of uncertain values when a single point is entered. Alternatively, a row and/or column of additional points is placed in the given interpolation data to demonstrate the node-added CHFIS. There are three techniques for inserting new points, corresponding to the row and/or column of nodes inserted, and each method is further classified into four types based on the values of the inserted nodes. As a result, numerous forms of node insertion can be found in a CHFIS.

Keywords: fractal, interpolation, iterated function system, coalescence, node insertion, knot insertion

Procedia PDF Downloads 86
24226 Advertising Message Strategy on Ghana’s TV

Authors: Aisha Iddrisu, Ferruh Uztuğ

Abstract:

This study is a quantitative content analysis of the advertising message strategies used in Ghana's TV commercials (2020-2021), using the modified strategy framework of Wang and Praet (2016), with the objective of exploring the various advertising message strategies used in Ghanaian TV advertising, their variation according to product category, and the most widely used message strategy. The findings indicate that, out of the 220 commercials used in the study, the affective message strategy (n = 122, 55%) was the dominant message strategy in Ghana's TV commercials. The most advertised product category in Ghana's TV commercials (2020-2021) was food, and a significant relationship was observed between message strategy and product category, as well as between message strategy and brand type.
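
The relationship between message strategy and product category reported above is the kind of association a chi-square test of independence on the coded frequencies would establish. The sketch below shows that test with scipy on an illustrative, made-up cross-tabulation (the strategy and category labels beyond "affective" and "food" are placeholders), not the study's actual coding results.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative strategy-by-category counts (rows: affective, informational, image;
# columns: food, telecom, finance, personal care); not the study's actual table
observed = np.array([
    [40, 30, 22, 30],
    [18, 12, 15, 10],
    [12,  8, 13, 10],
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
if p < 0.05:
    print("Message strategy and product category are significantly associated.")
```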

Keywords: advertising, message strategy, Ghana, television

Procedia PDF Downloads 166
24225 "Epitaph" Charles Mingus’ Foresight of Jazz

Authors: Christel Elisabeth Bonin

Abstract:

The score of the 2½-hour ‘magnum opus’ named ‘Epitaph’ was reconstructed 10 years after Charles Mingus’ death in 1979. Most of the movements were probably composed in the late 1950s. As the finale was missing, Gunther Schuller, the conductor of the world premiere in 1989, decided to improvise one with the orchestra, using Mingus as a guide. The aim of this paper is to analyze ‘Main Score Part I’ and ‘Main Score Part II’ and to look into the score of Mingus’ reconstructed compositions, with particular attention to the new finale, ‘Main Score Reprise’. There, Mingus left instructions for a return to the opening section of ‘Epitaph’. By examining ‘Epitaph’ in the historical context of jazz from 1955 to 1967 and in the 1980s, and by comparing the finale of ‘Epitaph’, created (or rather improvised) by the musicians of the 1989 world premiere, with the opening section, it will first be interesting to discover to what extent Gunther Schuller followed Mingus’ creative process and brought it to life in 1989. Finally, it will be considered whether Charles Mingus’ composition still represents a foresight of jazz nearly 30 years after its creation.

Keywords: epitaph, Charles Mingus, Gunther Schuller, jazz reception, bebop, hard bop, Duke Ellington, Black, Brown and Beige, African-American music, free jazz

Procedia PDF Downloads 298
24224 Optimizing the Efficiency of Measuring Instruments in Ouagadougou-Burkina Faso

Authors: Moses Emetere, Marvel Akinyemi, S. E. Sanni

Abstract:

At the moment, the AERONET and AMMA databases show a large volume of data loss. With only about 47% of the data set available to scientists, it is evident that accurate nowcasts or forecasts cannot be guaranteed. The calibration constants of most radiosondes or weather stations are not compatible with the atmospheric conditions of the West African climate. A dispersion model was developed to incorporate salient mathematical representations, such as a Unified number. The Unified number was derived to describe the turbulence of aerosol transport in the frictional layer of the lower atmosphere. A fourteen-year data set from the Multi-angle Imaging SpectroRadiometer (MISR) was tested using the dispersion model. A yearly estimation of the atmospheric constants over Ouagadougou using the model was obtained with about 87.5% accuracy. It further revealed that the average atmospheric constants for Ouagadougou are a₁ = 0.626 and a₂ = 0.7999, and the tuning constants are n₁ = 0.09835 and n₂ = 0.266. The yearly atmospheric constants also affirmed that the lower atmosphere of Ouagadougou is very dynamic. Hence, it is recommended that radiosonde and weather station manufacturers constantly review the atmospheric constants over a geographical location to enable about eighty percent data retrieval.

Keywords: aerosols retention, aerosols loading, statistics, analytical technique

Procedia PDF Downloads 295
24223 Modern Imputation Technique for Missing Data in Linear Functional Relationship Model

Authors: Adilah Abdul Ghapor, Yong Zulina Zubairi, Rahmatullah Imon

Abstract:

The missing value problem is common in statistics and has been of interest for years. This article considers two modern techniques for handling missing data in the linear functional relationship model (LFRM), namely the Expectation-Maximization (EM) algorithm and the Expectation-Maximization with Bootstrapping (EMB) algorithm, using three performance indicators: the mean absolute error (MAE), the root mean square error (RMSE) and the estimated bias (EB). In this study, we applied these methods to impute missing values in the LFRM. Results of the simulation study suggest that the EMB algorithm performs much better than the EM algorithm in both models. We also illustrate the applicability of the approach on a real data set.
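
The three performance indicators used above compare imputed values (or parameter estimates) with the true simulated ones; a minimal sketch of their computation follows, with placeholder vectors standing in for the LFRM simulation output. The EM and EMB imputation steps themselves are not reproduced here.

```python
import numpy as np

def mae(true, imputed):
    """Mean absolute error of the imputed values."""
    return float(np.mean(np.abs(np.asarray(true) - np.asarray(imputed))))

def rmse(true, imputed):
    """Root mean square error of the imputed values."""
    return float(np.sqrt(np.mean((np.asarray(true) - np.asarray(imputed)) ** 2)))

def estimated_bias(true_param, estimated_param):
    """Estimated bias of a parameter estimate (e.g., the LFRM slope)."""
    return float(np.mean(np.asarray(estimated_param) - true_param))

# Placeholder values standing in for one simulation run
true_missing = np.array([2.3, 4.1, 5.8, 7.2, 9.0])
em_imputed   = np.array([2.6, 3.8, 6.1, 7.5, 8.6])
emb_imputed  = np.array([2.4, 4.0, 5.9, 7.3, 8.9])

for name, imp in [("EM", em_imputed), ("EMB", emb_imputed)]:
    print(f"{name}: MAE = {mae(true_missing, imp):.3f}, "
          f"RMSE = {rmse(true_missing, imp):.3f}")

# Bias of the slope estimate across (say) 100 simulation replicates
print("EB of slope:", estimated_bias(1.5, np.full(100, 1.52)))
```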

Keywords: expectation-maximization, expectation-maximization with bootstrapping, linear functional relationship model, performance indicators

Procedia PDF Downloads 379
24222 Malignancy Assessment of Brain Tumors Using Convolutional Neural Network

Authors: Chung-Ming Lo, Kevin Li-Chun Hsieh

Abstract:

The World Health Organization classification of central nervous system tumors defines grade 2, 3, and 4 gliomas according to their aggressiveness. For brain tumors, image examination carries a lower risk than biopsy; besides, it is a challenge to extract relevant tissue in a biopsy operation. Observing the whole tumor structure and composition can provide a more objective assessment. This study further proposed a computer-aided diagnosis (CAD) system based on a convolutional neural network to quantitatively evaluate a tumor's malignancy from brain magnetic resonance imaging. A total of 30 grade 2, 43 grade 3, and 57 grade 4 gliomas were collected in the experiment. Parameters transferred from AlexNet were fine-tuned to classify the target brain tumors, achieving an accuracy of 98% and an area under the receiver operating characteristic curve (Az) of 0.99. Without pre-trained features, only 61% accuracy was obtained. The proposed convolutional neural network can accurately and efficiently classify grade 2, 3, and 4 gliomas. The promising accuracy can provide diagnostic suggestions to radiologists in the clinic.
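
A minimal sketch of the transfer-learning step described above follows: a pretrained AlexNet from torchvision has its final layer replaced with a three-class head (grade 2, 3, 4) and is fine-tuned; the dummy batch, freezing strategy and training settings are placeholders, not the study's protocol.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load AlexNet with ImageNet-pretrained ("transferred") parameters
model = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)

# Replace the 1000-class ImageNet head with a 3-class head (grade 2 / 3 / 4)
model.classifier[6] = nn.Linear(4096, 3)

# Optionally freeze the convolutional feature extractor and fine-tune only the classifier
for param in model.features.parameters():
    param.requires_grad = False
optimizer = torch.optim.SGD(model.classifier.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 3-channel 224x224 "MR" patches
images = torch.randn(4, 3, 224, 224)          # placeholder batch, not real MRI data
labels = torch.tensor([0, 1, 2, 1])           # grade indices
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"dummy-batch loss: {loss.item():.4f}")
```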

Keywords: convolutional neural network, computer-aided diagnosis, glioblastoma, magnetic resonance imaging

Procedia PDF Downloads 131