Search results for: Analytic Hierarchy Process (AHP)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4438

1408 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, the task was to determine the ten or fewer most predictive questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to classify quickly and be usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest is an ensemble of individual decision trees whose majority vote yields a predicted segment with more robust precision and recall than a single tree. A random, stratified 70-30 train-test split was used, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real time. Random Forest was determined to be an optimal classification model for its feature selection, performance, processing speed, and flexible application in other environments.
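
For illustration, a minimal scikit-learn sketch of the workflow described above (the file name, column names, and tree count are illustrative assumptions, not taken from the paper):

```python
# Minimal sketch: stratified 70-30 split, depth-10 forest, top-20 feature reduction.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("survey_responses.csv")        # hypothetical file: 254 features + "segment"
X, y = df.drop(columns=["segment"]), df["segment"]

# Random 70-30 stratified split, as in the paper.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=42)

rf = RandomForestClassifier(n_estimators=500, max_depth=10, random_state=42)
rf.fit(X_train, y_train)

# Keep only the 20 most important features, then refit and re-score.
top20 = X.columns[rf.feature_importances_.argsort()[::-1][:20]]
rf20 = RandomForestClassifier(n_estimators=500, max_depth=10, random_state=42)
rf20.fit(X_train[top20], y_train)
print("accuracy with 20 features:", accuracy_score(y_test, rf20.predict(X_test[top20])))
```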

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 92
1407 Production and Characterization of Ce³⁺:Si₂N₂O Phosphors for White Light-Emitting Diodes

Authors: Alparslan A. Balta, Hilmi Yurdakul, Orkun Tunckan, Servet Turan, Arife Yurdakul

Abstract:

Si₂N₂O (sinoite) is an inorganic oxynitride material that is a promising phosphor candidate for white light-emitting diodes (WLEDs). However, there is currently limited knowledge on the synthesis of Si₂N₂O for this purpose. Here, to the best of the authors’ knowledge, we report for the first time the production of Si₂N₂O-based phosphors from CeO₂, SiO₂, and Si₃N₄ as the main starting powders, with a Li₂O sintering additive, through the spark plasma sintering (SPS) route. The processing parameters, e.g., pressure, temperature, and sintering time, were optimized to reach monophase Si₂N₂O-containing samples. The lattice parameter, crystallite size, and amounts of the phases formed were characterized in detail by X-ray diffraction (XRD). Grain morphology, particle size, and distribution were analyzed by scanning and transmission electron microscopy (SEM and TEM). Cathodoluminescence (CL) in SEM and photoluminescence (PL) analyses were conducted on the samples to determine the excitation and emission characteristics of Ce³⁺-activated Si₂N₂O. Results showed that the Si₂N₂O phase, in a maximum 90% ratio, was obtained by sintering for 15 minutes at 1650 °C under 30 MPa pressure. Based on the SEM-CL and PL measurements, the Ce³⁺:Si₂N₂O phosphor shows a broad emission band between 400 and 700 nm that corresponds to white light. The present research was supported by TUBITAK under project number 217M667.

Keywords: cerium, oxynitride, phosphors, sinoite, Si₂N₂O

Procedia PDF Downloads 104
1406 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms

Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao

Abstract:

Earth's environment and its evolution can be observed through satellite images in near real time. Through satellite imagery, remote sensing provides crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and climate change monitoring. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing it, feeding the processed data into the proposed algorithm, and analyzing the result. Algorithms commonly used in satellite imagery classification include U-Net, Random Forest, DeepLabv3, CNNs, ANNs, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification, with the DeepGlobe Land Cover Classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates, in cascade or in parallel, to determine the scale of segments.
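
A minimal inference sketch using torchvision's DeepLabv3 implementation; the 7-class DeepGlobe head, the file name, and the use of the newer `weights=` API (torchvision ≥ 0.13) are assumptions, not details from the paper:

```python
# Per-pixel land cover prediction with a DeepLabv3 (atrous convolution) model.
import torch
from torchvision.models.segmentation import deeplabv3_resnet50
from torchvision import transforms
from PIL import Image

model = deeplabv3_resnet50(weights=None, num_classes=7)  # 7 DeepGlobe land cover classes
model.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

img = preprocess(Image.open("tile.png").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    out = model(img)["out"]               # shape: (1, 7, H, W)
pred = out.argmax(dim=1)                  # per-pixel land cover class map
```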

Keywords: area calculation, atrous convolution, DeepGlobe land cover classification, DeepLabv3, land cover classification, ResNet-50

Procedia PDF Downloads 137
1405 Segmentation of Liver Using Random Forest Classifier

Authors: Gajendra Kumar Mourya, Dinesh Bhatia, Akash Handique, Sunita Warjri, Syed Achaab Amir

Abstract:

Medical imaging has become an integral part of modern healthcare. Abdominal CT images are an invaluable means of abdominal organ investigation and have been widely studied in recent years. Diagnosis of liver pathologies is one of the major areas of current interest in the field of medical image processing and is still an open problem. To study and diagnose the liver in depth, segmentation of the liver is performed to identify which part of it is most affected. Manual segmentation of the liver in CT images is time-consuming and suffers from inter- and intra-observer differences. However, automatic or semi-automatic computer-aided segmentation of the liver is a challenging task due to inter-patient variability in liver shape and size. In this paper, we present a technique for automatically segmenting the liver from CT images using a Random Forest classifier. Random forests, or random decision forests, are an ensemble learning method for classification that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes of the individual trees. After comparison with various other techniques, the Random Forest classifier was found to provide better segmentation results with respect to accuracy and speed. We validated our results using various techniques, and they show above 89% accuracy in all cases.
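
A small illustration of the "mode of the classes of the individual trees" rule quoted above, applied per voxel; the features and labels are synthetic stand-ins for CT-derived ones, not the paper's data:

```python
# Hand-rolled forest: bootstrap-sampled trees plus per-voxel majority vote.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                  # e.g. intensity, texture, position features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # 1 = liver voxel, 0 = background

trees = []
for _ in range(25):
    idx = rng.integers(0, len(X), len(X))       # bootstrap sample of the training voxels
    trees.append(DecisionTreeClassifier(max_features="sqrt").fit(X[idx], y[idx]))

votes = np.stack([t.predict(X) for t in trees])         # (n_trees, n_voxels)
segmentation = (votes.mean(axis=0) > 0.5).astype(int)   # mode of the tree outputs
print("training accuracy:", (segmentation == y).mean())
```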

Keywords: CT images, image validation, random forest, segmentation

Procedia PDF Downloads 312
1404 Bioavailability of Iron in Some Selected Fiji Foods Using an in Vitro Technique

Authors: Poonam Singh, Surendra Prasad, William Aalbersberg

Abstract:

Iron is the most essential trace element in human nutrition. Its deficiency has serious health consequences and is a major public health threat worldwide. The deficiencies commonly reported in the Fiji population are of Fe, Ca, and Zn. It has also been reported that 40% of women in Fiji are iron deficient. Therefore, we have been studying the bioavailability of iron in commonly consumed Fiji foods. To study bioavailability, it is essential to assess the iron content of the raw foods. This paper reports the iron content and bioavailability of foods commonly consumed by the multicultural population of Fiji. The food samples (rice, breads, wheat flour, and breakfast cereals) were analyzed by atomic absorption spectrophotometry for total iron and its bioavailability. White rice had the lowest total iron, 0.10±0.03 mg/100 g, but high bioavailability of 160.60±0.03%. Brown rice had a total iron content of 0.20±0.03 mg/100 g that was 85.00±0.03% bioavailable. The white and brown breads showed the highest iron bioavailability, at 428.30±0.11% and 269.35±0.02%, respectively. Weetabix and rolled oats had iron contents of 2.89±0.27 and 1.24±0.03 mg/100 g, with bioavailability of 14.19±0.04% and 12.10±0.03%, respectively. The most commonly consumed normal wheat flour had 0.65±0.00 mg/100 g iron, while the whole meal and Roti flours had 2.35±0.20 and 0.62±0.17 mg/100 g iron, showing bioavailability of 55.38±0.05%, 16.67±0.08%, and 12.90±0.00%, respectively. The low bioavailability of iron in certain foods may be due to the presence of phytates/oxalates, processing/storage conditions, cooking method, or interaction with other minerals present in the food samples.

Keywords: iron, bioavailability, Fiji foods, in vitro technique, human nutrition

Procedia PDF Downloads 529
1403 Functional Gene Expression in Human Cells Using Linear Vectors Derived from Bacteriophage N15 Processing

Authors: Kumaran Narayanan, Pei-Sheng Liew

Abstract:

This paper adapts the bacteriophage N15 protelomerase enzyme to assemble linear chromosomes as vectors for gene expression in human cells. Phage N15 has the unique ability to replicate as a linear plasmid with telomeres in E. coli during the prophage stage of its life cycle. The virus-encoded protelomerase enzyme (TelN) cuts the circular genome and caps its ends to form hairpin telomeres, resulting in a linear, human-chromosome-like structure in E. coli. In mammalian cells, however, no enzyme with TelN-like activity has been found. In this work, we show, for the first time, transfer of the protelomerase from phage into human and mouse cells and demonstrate recapitulation of its activity in these hosts. The function of the enzyme is assayed by demonstrating cleavage of its target DNA, followed by detection of telomere formation based on resistance to RecBCD digestion. We show that protelomerase expression persists for at least 60 days, which indicates limited silencing of its expression. Next, we show that an intact human β-globin gene delivered on this linear chromosome accurately retains its expression in the human cellular environment for at least 60 hours, demonstrating its stability and potential as a vector. These results demonstrate that the N15 protelomerase is able to function in mammalian cells to cut and heal DNA to create telomeres, providing a new tool for creating novel structures by DNA resolution in these hosts.

Keywords: chromosome, beta-globin, DNA, gene expression, linear vector

Procedia PDF Downloads 190
1402 Influence of κ-Casein Genotype on Milk Productivity of Latvia Local Dairy Breeds

Authors: S. Petrovska, D. Jonkus, D. Smiltiņa

Abstract:

κ-casein is one of the milk proteins that are very important for milk processing. κ-casein genotype affects milk yield and fat and protein content. The main factors affecting the milk yield and composition of local Latvian dairy breeds are analyzed in this research. Data were collected from 88 Latvian Brown and 82 Latvian Blue cows in 2015. The frequency of the AA genotype was 0.557 in Latvian Brown and 0.232 in Latvian Blue; the frequency of the BB genotype was 0.034 in Latvian Brown and 0.207 in Latvian Blue. The highest milk yield was observed in Latvian Brown (5131.2 ± 172.01 kg), which also had significantly higher fat content and fat yield (p < 0.05). Significant differences between κ-casein genotypes were not found in Latvian Brown, but the highest milk yield (5057 ± 130.23 kg), protein content (3.42 ± 0.03%), and protein yield (171.9 ± 4.34 kg) were observed with the AB genotype. Significantly higher fat content was observed in the Latvian Blue breed with the BB genotype (4.29 ± 0.17%) compared with the AA genotype (3.42 ± 0.19%). A similar tendency was found in protein content: 3.27 ± 0.16% with the BB genotype and 2.59 ± 0.16% with the AA genotype (p < 0.05). Milk yield increased with increasing parity. We did not observe any major tendency of change in milk fat and protein content according to parity.

Keywords: dairy cows, κ-casein, milk productivity, polymorphism

Procedia PDF Downloads 269
1401 A Review on Medical Image Registration Techniques

Authors: Shadrack Mambo, Karim Djouani, Yskandar Hamam, Barend van Wyk, Patrick Siarry

Abstract:

This paper discusses current trends in medical image registration techniques and addresses the need to provide a solid theoretical foundation for research endeavours in this field. A methodological analysis and synthesis of quality literature was done, providing a platform for developing a good foundation for research in this field, which is crucial for understanding the existing levels of knowledge. Research on medical image registration techniques assists clinical and medical practitioners in the diagnosis of tumours and lesions in anatomical organs, thereby enabling fast and accurate curative treatment of patients. Out of these considerations, the aim of this paper is to enhance the scientific community's understanding of the current status of research in medical image registration techniques and to communicate the contribution of this research to the field of image processing. The gaps identified in current techniques can be closed by the use of artificial neural networks, which form learning systems designed to minimise an error function. The paper also suggests several areas of future research in image registration.

Keywords: image registration techniques, medical images, neural networks, optimisation, transformation

Procedia PDF Downloads 175
1400 Impact of Modifying the Surface Materials on the Radiative Heat Transfer Phenomenon

Authors: Arkadiusz Urzędowski, Dorota Wójcicka-Migasiuk, Andrzej Sachajdak, Magdalena Paśnikowska-Łukaszuk

Abstract:

Due to climate change and the need to reduce greenhouse gas emissions, the demand for low-carbon, sustainable construction has increased. In this work, we investigate how the surface texture of building materials and the radiative heat transfer phenomenon in flat multilayer partitions are correlated. Attempts to measure surface emissivity are made; however, the trustworthiness of the measurement results remains a concern, since sensor size and thickness are common problems. This paper presents an experimental method for studying surface emissivity using self-constructed thermal sensors and thermal imaging. The surfaces of the building materials were modified by mechanical and chemical treatment to reduce their emissivity. Scanning profilometry was used in the laboratory to examine the shaped surfaces of the materials and to map their three-dimensional structure. A comparison of the laboratory results with analyses performed in 3D computational fluid dynamics software shows that a change in the surface coverage of materials affects radiative heat transport between layers. The publication demonstrates that, with a dedicated data processing approach and properly constructed temperature sensors, the influence of surface emissivity on radiation and heat transport in the entire partition can be determined.

Keywords: heat transfer, surface roughness, surface emissivity, radiation

Procedia PDF Downloads 94
1399 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights

Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan

Abstract:

The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyze huge datasets, efficient NoSQL databases are needed. This research integrates several datasets, which cuts down query processing time and produces predictive visual artifacts, making possible the analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Spreading the datasets across a sharded database and building indexes on individual shards enables effective data retrieval and analysis. The key goal is the analysis of connections between governmental activities, poverty levels, and post-pandemic well-being. We evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
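
A minimal pymongo sketch of the indexing and sharding setup recommended above; the database, collection, and key names are illustrative, not the study's actual schema:

```python
# Index the queried fields and spread the collection across shards.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://mongos-router:27017")  # connect via the mongos router
db = client["pandemic"]

# Index the fields used by the analysis queries; the index is built on each shard.
db.mobility.create_index([("country", ASCENDING), ("date", ASCENDING)])

# Enable sharding on the database, then shard the collection by a hashed key.
client.admin.command("enableSharding", "pandemic")
client.admin.command("shardCollection", "pandemic.mobility", key={"country": "hashed"})
```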

Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being

Procedia PDF Downloads 68
1398 Evaluating Effectiveness of Training and Development Corporate Programs: The Russian Agribusiness Context

Authors: Ekaterina Tikhonova

Abstract:

This research aims to evaluate the effectiveness of training and development (T&D) using the example of two T&D programs for executive top management run in 2012 and 2015-2016 at Komos Group. The study was commissioned to compare the effectiveness of two similar corporate T&D programs (within one company) in two periods (2012 and 2015-2016) by evaluating them with Kirkpatrick’s four-level model of T&D evaluation and by calculating return on investment (ROI) with Phillips’ formula as an instrument for T&D program measurement. The research investigates the correlation of two figures: the calculated ROI and the rating percentage scale for ROI implementation (Wagle’s scale). The study includes a 360-degree feedback assessment (Kirkpatrick's model) and Phillips’ ROI methodology, which provides a step-by-step process for collecting, summarizing, and processing the data. The data are collected from the company's accounting records, HR budgets, MCFO, and the company's annual reports for the research periods. All analyzed data and reports are organized and presented in tables, charts, and graphs. The paper also briefly describes some constraints of the research. After the ROI calculation, the study reveals that ROI falls in the average implementation range (65% to 75%) on Wagle’s scale, which can be considered a positive outcome. The paper also gives recommendations on how to use ROI in practice and describes the main benefits of ROI implementation.
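
Phillips' formula computes ROI as net program benefits divided by program costs, times 100. A minimal sketch (the monetary figures are illustrative, not Komos Group's data):

```python
# Phillips' ROI formula: ROI (%) = (net program benefits / program costs) * 100.
def phillips_roi(program_benefits: float, program_costs: float) -> float:
    return (program_benefits - program_costs) / program_costs * 100

# e.g. a T&D program costing 1.0M that returned 1.7M in monetized benefits
print(f"ROI = {phillips_roi(1_700_000, 1_000_000):.0f}%")  # -> ROI = 70%, i.e. within
# Wagle's "average implementation" band of 65-75% cited in the abstract.
```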

Keywords: ROI, organizational performance, efficacy of T&D program, employee performance

Procedia PDF Downloads 250
1397 Optimization of Ultrasound Assisted Extraction and Characterization of Functional Properties of Dietary Fiber from Oat Cultivar S2000

Authors: Muhammad Suhail Ibrahim, Muhammad Nadeem, Waseem Khalid, Ammara Ainee, Taleeha Roheen, Sadaf Javaria, Aftab Ahmed, Hira Fatima, Mian Nadeem Riaz, Muhammad Zubair Khalid, Isam A. Mohamed Ahmed J, Moneera O. Aljobair

Abstract:

This study was executed to explore the efficacy of ultrasound-assisted extraction of dietary fiber from oat cultivar S2000. The extraction variables (time, temperature, and amplitude) were optimized using response surface methodology (RSM) with a Box-Behnken design (BBD). The effects of time, temperature, and amplitude were studied at three levels. It was observed that time and temperature exerted more impact on extraction efficiency than amplitude. The highest yields of total dietary fiber (TDF), soluble dietary fiber (SDF), and insoluble dietary fiber (IDF) fractions were observed under ultrasound processing for 20 min at 40 °C with 80% amplitude. Characterization of the extracted dietary fiber showed that it had better crystallinity, better thermal properties, and a good fibrous structure. It also showed better functional properties than traditionally extracted dietary fiber. Furthermore, dietary fibers from oats may offer high-value utilization and expanded comprehensive utilization in functional food and nutraceutical development.
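
A minimal sketch of a three-factor Box-Behnken design in coded and actual units; the factor ranges are illustrative assumptions centred near the reported optimum (20 min, 40 °C, 80% amplitude), not the paper's exact design:

```python
# Three-factor BBD: 12 edge-midpoint runs (two factors at +/-1, one at 0) + centre.
from itertools import combinations

factors = {"time_min": (10, 30), "temp_C": (30, 50), "amplitude_pct": (60, 100)}
names = list(factors)

runs = []
for i, j in combinations(range(3), 2):
    for a in (-1, 1):
        for b in (-1, 1):
            coded = [0, 0, 0]
            coded[i], coded[j] = a, b
            runs.append(coded)
runs.append([0, 0, 0])                        # centre point

for coded in runs:
    actual = {n: (lo + hi) / 2 + c * (hi - lo) / 2
              for (n, (lo, hi)), c in zip(factors.items(), coded)}
    print(coded, actual)                      # 13 runs in coded and actual units
```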

Keywords: extraction, ultrasonication, response surface methodology, Box-Behnken design

Procedia PDF Downloads 48
1396 Role of mHealth in Effective Response to Disaster

Authors: Mohammad H. Yarmohamadian, Reza Safdari, Nahid Tavakoli

Abstract:

In recent years, many countries have suffered various natural disasters, and disaster response continues to face challenges in the health care sector in all countries. Information and communication management is a significant challenge at a disaster scene. During the last decades, rapid advances in information technology have made it possible to manage information effectively and improve communication in health care settings. Information technology is a vital solution for effective response to disasters and emergencies: if an efficient ICT-based health information system is available, it will be highly valuable in such situations. In particular, mobile technology represents a readily available computing infrastructure that is accessible, convenient, inexpensive, and easy to use. Most projects have not yet reached the deployment stage, but evaluation exercises show that mHealth should allow faster processing and transport of patients, improved accuracy of triage, and better monitoring of unattended patients at a disaster scene. Since there is a high prevalence of cell phones among the world population, health care providers and managers are expected to take measures to apply this technology to improve patient safety and public health in disasters. At present, there are challenges in the utilization of mHealth in disasters, such as structural and financial issues in our country. In this paper, we discuss the benefits and challenges of mHealth technology in disaster settings, considering connectivity, usability, intelligibility, communication, and teaching, with a view to implementing this technology for disaster response.

Keywords: information technology, mHealth, disaster, effective response

Procedia PDF Downloads 440
1395 Balancing the Need for Closure: A Requirement for Effective Mood Development in Flow

Authors: Cristian Andrei Nica

Abstract:

The state of flow relies on cognitive elements that sustain openness to information processing in order to promote goal attainment. However, the need for closure may create mental constraints, which can impact affectivity levels. This study aims to observe the extent to which need for closure moderates the interaction between flow and affectivity, taking into account the mediating role of mood repair motivation in the interaction between need for closure and affectivity. Using a non-experimental, correlational design, n = 73 participants (18 men and 55 women), aged 19-64 years (M = 28.02, SD = 9.22), completed the Positive Affectivity-Negative Affectivity Schedule, the Need for Closure Scale-Revised, the mood repair items, and an adapted version of the Flow State Scale-2, in order to assess the trait aspects of flow. Results show that need for closure significantly moderates the flow-affectivity process, while the tolerance of ambiguity sub-scale is positively associated with negative affectivity and negatively with positive affectivity. At the same time, mood repair motivation significantly mediates the interaction between need for closure and positive affectivity, whereas the mediation process for negative affectivity is not significant. Need for closure should therefore be considered when promoting the development of positive emotions. According to this study, flow can trigger positive emotions when the person is willing to engage in mood regulation strategies and approach meaningful experiences with an open mind.

Keywords: flow, mood regulation, mood repair motivation, need for closure, negative affectivity, positive affectivity

Procedia PDF Downloads 121
1394 Predictive Maintenance of Industrial Shredders: Efficient Operation through Real-Time Monitoring Using Statistical Machine Learning

Authors: Federico Pittino, Thomas Arnold

Abstract:

The shredding of waste materials is a key step in the recycling process towards the circular economy. Industrial shredders for waste processing operate in very harsh conditions, leading to the need for frequent maintenance of critical components. Maintenance optimization is particularly important also to increase the machine's efficiency, thereby reducing operational costs. In this work, a monitoring system was developed and deployed on an industrial shredder located at a waste recycling plant in Austria. The machine was monitored for one year, and methods for predictive maintenance were developed for two key components: the cutting knives and the drive belt. The large amount of collected data is leveraged by statistical machine learning techniques, thereby not requiring very detailed knowledge of the machine or its live operating conditions. The results show that, despite the wide range of operating conditions, a reliable estimate of the optimal time for maintenance can be derived. Moreover, the trade-off between the cost of maintenance and the increase in power consumption due to the wear state of the monitored components is investigated. This work proves the benefits of a real-time monitoring system for the efficient operation of industrial shredders.
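
A minimal sketch of one statistical idea behind such systems: fit a trend to a wear indicator (here, power draw) and extrapolate when it crosses a cost-optimal service threshold. The data and threshold are illustrative, not the paper's models:

```python
# Linear wear-trend extrapolation from monitored power consumption.
import numpy as np

days = np.arange(60)
power_kw = 110 + 0.35 * days + np.random.default_rng(1).normal(0, 1.5, 60)

slope, intercept = np.polyfit(days, power_kw, 1)   # fitted wear trend
threshold_kw = 140                                  # assumed cost-optimal service point
days_to_service = (threshold_kw - intercept) / slope

print(f"trend: +{slope:.2f} kW/day; service due around day {days_to_service:.0f}")
```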

Keywords: predictive maintenance, circular economy, industrial shredder, cost optimization, statistical machine learning

Procedia PDF Downloads 120
1393 Stability of Novel Peptides (Linusorbs) in Flaxseed Meal Fortified Gluten-Free Bread

Authors: Youn Young Shim, Martin J. T. Reaney

Abstract:

Flaxseed meal is rich in water-soluble gums and, as such, can improve texture in gluten-free products. Flaxseed bioactive antioxidant peptides, linusorbs (LOs, a.k.a. cyclolinopeptides), are a class of molecules that may contribute health-promoting effects. The effects of dough preparation, baking, and storage on the stability of flaxseed-derived LOs in doughs and baked products are unknown. Gluten-free (GF) bread dough and bread were prepared with flaxseed meal, and the LO content was determined in the flaxseed meal, the bread flour containing the flaxseed meal, the bread dough, and the bread. The LO contents during storage (0, 1, 2, and 4 weeks) at different temperatures (−18 °C, 4 °C, and 22−23 °C) were determined by high-performance liquid chromatography-diode array detection (HPLC-DAD). The contents of oxidized LOs such as [1–9-NαC],[1(Rs, Ss)-MetO]-linusorb B2 (LO14) were substantially constant in flaxseed meal and in flour produced from flaxseed meal under all conditions for up to 4 weeks. However, during GF bread production, LOs decreased. Due to microbial contamination, dough could not be stored at either 4 or 21 °C, and bread could only be stored for one week at 21 °C. Storage for up to 4 weeks was possible for bread and dough at −18 °C, and for bread at 4 °C, without loss of LOs. The LOs changed mostly as a result of processing and less so from storage. The concentrations of reduced LOs in flour and meal were much higher than those measured in dough and bread, with no corresponding increase in oxidized LOs. The LOs in flaxseed meal-fortified bread were stable in products stored at low temperatures. This study is the first on the impact of baking conditions on LO content and quality.

Keywords: flaxseed, stability, gluten-free, antioxidant

Procedia PDF Downloads 86
1392 Natural Fibre Composite Structural Sections for Residential Stud Wall Applications

Authors: Mike R. Bambach

Abstract:

Increasing awareness of environmental concerns is driving a move towards more sustainable structural products for the built environment. Natural fibres such as flax, jute, and hemp have recently been considered for fibre-resin composites, with a major motivation for their implementation being their notable sustainability attributes. While recent decades have seen substantial interest in the use of such natural fibres in composite materials, much of this research has focused on materials aspects, including fibre processing techniques, composite fabrication methodologies, matrix materials, and their effects on the mechanical properties. The present study experimentally investigates the compression strength of structural channel sections of flax, jute, and hemp, with a particular focus on their suitability for residential stud wall applications. The section geometry is optimised for maximum strength via the introduction of complex stiffeners in the webs and flanges. Experimental results on both natural fibre composite channel sections and typical steel and timber residential wall studs are compared. The geometrically optimised natural fibre composite channels are shown to have compression capacities suitable for residential wall stud applications, identifying them as a potentially viable alternative to traditional building materials in such applications, and potentially in other light structural applications.

Keywords: channel sections, natural fibre composites, residential stud walls, structural composites

Procedia PDF Downloads 313
1391 Preparation and Characterization of Iron/Titanium-Pillared Clays

Authors: Rezala Houria, Valverde Jose Luis, Romero Amaya, Molinari Alessandra, Maldotti Andrea

Abstract:

The escalation of oil prices in 1973 confronted the oil industry with the problem of how to maximize the processing of crude oil, especially the heavy fractions, to give gasoline components. Strong impetus was thus given to the development of catalysts with relatively large pore sizes, able to deal with larger molecules than the existing molecular sieves could, and with good thermal and hydrothermal stability. The oil embargo of 1973 therefore acted as a stimulus for the investigation and development of pillared clays. Iron-doped titania-pillared montmorillonite clays were prepared using bentonite from deposits at Maghnia in western Algeria. The preparation method consists of several steps: purification of the raw bentonite, preparation of a pillaring agent solution, and exchange of the cations located between the clay layers with the previously formed iron/titanium solution. The characterization of this material was carried out by X-ray fluorescence spectrometry, X-ray diffraction, textural measurements by the BET method, inductively coupled plasma atomic emission spectroscopy, diffuse reflectance UV-visible spectroscopy, temperature-programmed desorption of ammonia, and atomic absorption. This new material was investigated as a photocatalyst for the selective oxygenation of liquid alkylaromatics such as toluene, para-xylene, and ortho-xylene, and its photocatalytic properties were compared with those of titanium-pillared clays.

Keywords: iron doping, montmorillonite clays, pillared clays, oil industry

Procedia PDF Downloads 302
1390 A Verification Intellectual Property for Multi-Flow Rate Control on Any Single Flow Bus Functional Model

Authors: Pawamana Ramachandra, Jitesh Gupta, Saranga P. Pogula

Abstract:

In the verification of high-volume, complex packet processing IPs, finer control of flow management aspects (for example, rate in bits/sec) per flow class (or virtual channel, or software thread) is needed. When software/Universal Verification Methodology (UVM) thread arbitration is left to the simulator (e.g., Verilog Compiler Simulator (VCS) or the Incisive Enterprise Simulator core simulation engine (NCSIM)), the resulting distribution of bandwidth across threads is hard to predict. In many cases, the patterns desired in a test scenario may not be accomplished, as the simulator might produce a different distribution than what was required. This can lead to missing multiple traffic scenarios, specifically deadlock- and starvation-related ones. We invented a component (namely, the Flow Manager Verification IP) that intervenes between the application (test case) and the protocol VIP (with its UVM sequencer) to control the bandwidth per thread/virtual channel/flow. The Flow Manager has knobs, visible to the UVM sequence/test, to configure the required distribution of rate per thread/virtual channel/flow. This works seamlessly and produces rate stimuli that further harness the Design Under Test (DUT) with asymmetric inputs, compared to the programmed bandwidth/Quality of Service (QoS) distributions in the Design Under Test.
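
The per-flow rate-control idea can be sketched outside UVM. Below is a toy Python model of weighted arbitration across virtual channels; it illustrates the concept only and is not the Flow Manager VIP's SystemVerilog/UVM API:

```python
# Toy model: pick the next flow in proportion to its configured bandwidth share.
import random

def pick_flow(weights: dict) -> str:
    return random.choices(list(weights), weights=list(weights.values()))[0]

shares = {"vc0": 0.5, "vc1": 0.3, "vc2": 0.2}    # knobs a test would configure
sent = {f: 0 for f in shares}
for _ in range(10_000):                          # 10k transactions
    sent[pick_flow(shares)] += 1
print({f: n / 10_000 for f, n in sent.items()})  # observed shares ~ configured shares
```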

Keywords: flow manager, UVM sequencer, rated traffic generation, quality of service

Procedia PDF Downloads 98
1389 Process Optimization for Albanian Crude Oil Characterization

Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici

Abstract:

Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To achieve optimal crude selection and processing decisions, a refiner must have exact information on crude oil quality. This includes the crude oil TBP (true boiling point) curve, the main input for correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized based on a distillation assay. This procedure is reasonably well-defined and is based on representing the mixture of actual components that boil within a boiling point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which can characterize the part of the oil that boils up to a 400 °C atmospheric-equivalent boiling point. To model the yield curves obtained by physical distillation, it is necessary to compare the differences between the modelled and the experimental data. Most commercial simulators use a different number of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of the study is the drawing of true boiling point curves for different crude oil resources in Albania and the comparison of the differences between the modelled and the experimental data for optimal characterization of crude oil.
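
A minimal sketch of the pseudo-component idea described above: cut an experimental TBP curve into boiling intervals and assign each cut the average boiling temperature of its interval. The assay points are illustrative, not an Albanian crude assay:

```python
# Interpolate a TBP assay and derive normal boiling points for 10 vol% cuts.
import numpy as np

vol_pct = np.array([0, 10, 20, 30, 40, 50, 60, 70])         # cumulative volume %
tbp_C = np.array([35, 90, 140, 190, 235, 280, 330, 395])    # ASTM D-2892 TBP, deg C

cuts = np.arange(0, 71, 10)                     # 10 vol% pseudo-component cuts
for lo, hi in zip(cuts[:-1], cuts[1:]):
    mid = np.interp((lo + hi) / 2, vol_pct, tbp_C)  # average boiling point of the cut
    print(f"pseudo-component {lo}-{hi} vol%: NBP ~ {mid:.0f} C")
```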

Keywords: TBP distillation curves, crude oil, optimization, simulation

Procedia PDF Downloads 302
1388 Implementation of the Association Rule Method in Determining the Layout of Qita Supermarket as a Strategy in the Competitive Retail Industry in Indonesia

Authors: Dwipa Rizki Utama, Hanief Ibrahim

Abstract:

The retail industry in Indonesia is developing very fast, and various strategies have been undertaken to boost customer satisfaction and purchase productivity, and thereby profit; one of them is the layout strategy. The purpose of this study is to determine the layout of Qita supermarket, a retailer in Indonesia, in order to improve customer satisfaction and maximize the rate of product sales as a whole, so that infrequently purchased products will also be purchased. This research uses a literature study method together with the association rule method, a data mining technique applied in market basket analysis. After pre-processing, 100 of 160 data records were tested, yielding the distribution of purchases across the 26 departments of the previous layout. From those data, the association rule method reveals customer behavior when purchasing items simultaneously, so that a supermarket layout based on customer behavior can be determined. Using the RapidMiner software with a minimum support of 25% and a minimum confidence of 30% showed that department 14 is purchased at the same time as department 10, department 21 at the same time as department 13, department 15 at the same time as department 12, department 14 at the same time as department 12, and department 10 at the same time as department 14. From those results, a better supermarket layout can be arranged than the previous one.
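
A minimal mlxtend sketch of the market-basket analysis described above, using the paper's thresholds (minimum support 25%, minimum confidence 30%); the basket data are illustrative, not the Qita transactions:

```python
# Apriori frequent itemsets + association rules over a one-hot basket matrix.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Rows = transactions, columns = departments (True = department appears in basket).
baskets = pd.DataFrame(
    [[1, 1, 0], [1, 0, 1], [1, 1, 1], [0, 1, 1]],
    columns=["dept_10", "dept_14", "dept_21"]).astype(bool)

frequent = apriori(baskets, min_support=0.25, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.30)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```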

Keywords: industry retail, strategy, association rule, supermarket

Procedia PDF Downloads 187
1387 Design of a Photovoltaic Power Generation System Based on Artificial Intelligence and Internet of Things

Authors: Wei Hu, Wenguang Chen, Chong Dong

Abstract:

In order to improve the efficiency and safety of photovoltaic power generation devices, this photovoltaic power generation system combines Artificial Intelligence (AI) and the Internet of Things (IoT) to control a sun-tracking photovoltaic device, improving power generation efficiency, and then to manage the converted energy. The system uses artificial intelligence as the control terminal; the power generation device's executive end runs the Linux system with an Exynos4412 as the CPU. Each power generating device collects sun image information through a Sony CCD. The power generating devices feed their data back to the CPUs for processing, and the CPUs send the data to the artificial intelligence control terminal through the Internet. The control terminal integrates the executive-terminal information, time information, and environmental information to decide whether to generate electricity normally, and then whether to feed the converted electrical energy into the grid or store it in the battery pack. When the power generation environment is abnormal, the control terminal authorizes the protection strategy: the power generation device's executive terminal stops generation and enters a self-protection posture, and at the same time the control terminal synchronizes the data with the cloud. As a result, the system is more intelligent, more adaptive, and has a longer life.

Keywords: photovoltaic power generation, sun tracking, artificial intelligence, internet of things, photovoltaic array, power management

Procedia PDF Downloads 123
1386 A Transformer-Based Question Answering Framework for Software Contract Risk Assessment

Authors: Qisheng Hu, Jianglei Han, Yue Yang, My Hoa Ha

Abstract:

When a company is considering purchasing software for commercial use, contract risk assessment is critical to identify risks and mitigate potential adverse business impact, e.g., security, financial, and regulatory risks. Contract risk assessment requires reviewers with specialized knowledge and the time to evaluate the legal documents manually. Specifically, validating contracts for a software vendor requires the following steps: manual screening, interpreting legal documents, and extracting risk-prone segments. To automate the process, we propose a framework to assist legal contract risk identification, leveraging pre-trained deep learning models and natural language processing techniques. Given a set of pre-defined risk evaluation problems, our framework utilizes pre-trained transformer-based question-answering models to identify risk-prone sections in a contract. The question-answering model encodes the concatenated question-contract text and predicts the start and end positions for clause extraction. Because of the limited labelled dataset for training, we leveraged transfer learning by fine-tuning the models on the CUAD dataset. On a dataset comprising 287 contract documents and 2,000 labelled samples, our best model achieved an F1 score of 0.687.
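
A minimal Hugging Face sketch of the extractive question-answering step described above; the model checkpoint, question, and contract text are illustrative, not the authors' fine-tuned model:

```python
# Extract a risk-prone clause with start/end positions via a QA pipeline.
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

contract = ("The Licensee shall indemnify the Licensor against all claims "
            "arising from use of the Software, without limitation of liability.")
result = qa(question="What is the limitation of liability?", context=contract)

# The pipeline returns the extracted span plus its character positions and score.
print(result["answer"], result["start"], result["end"], result["score"])
```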

Keywords: contract risk assessment, NLP, transfer learning, question answering

Procedia PDF Downloads 128
1385 Continuous Measurement of Spatial Exposure Based on Visual Perception in Three-Dimensional Space

Authors: Nanjiang Chen

Abstract:

In the backdrop of expanding urban landscapes, accurately assessing spatial openness is critical. Traditional visibility analysis methods grapple with discretization errors and inefficiencies, creating a gap in truly capturing the human experience of space. Addressing these gaps, this paper introduces a distinct continuous visibility algorithm, a leap in measuring urban spaces from a human-centric perspective. This study presents a methodological breakthrough by applying this algorithm to urban visibility analysis. Unlike conventional approaches, this technique allows for a continuous range of visibility assessment, closely mirroring human visual perception. By eliminating the need for predefined subdivisions in ray casting, it offers a more accurate and efficient tool for urban planners and architects. The proposed algorithm not only reduces computational errors but also demonstrates faster processing capabilities, validated through a case study in Beijing's urban setting. Its key distinction lies in its potential to benefit a broad spectrum of stakeholders, ranging from urban developers to public policymakers, aiding in the creation of urban spaces that prioritize visual openness and quality of life. This advancement in urban analysis methods could lead to more inclusive, comfortable, and well-integrated urban environments, enhancing the spatial experience for communities worldwide.
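
For contrast with the paper's continuous method, a minimal sketch of the conventional discretized baseline it improves on: cast a fixed number of rays from a viewpoint over a 2-D obstacle grid and score openness as the unobstructed fraction. Grid, counts, and viewpoint are illustrative:

```python
# Discretized ray-casting openness score; the per-ray subdivision is exactly
# the source of the discretization error the paper's approach removes.
import numpy as np

grid = np.zeros((100, 100), dtype=bool)   # True = blocked cell (e.g. a building)
grid[40:60, 55:60] = True

def openness(x, y, n_rays=360, max_dist=50):
    clear = 0
    for theta in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
        for r in range(1, max_dist):      # march along the ray cell by cell
            i, j = int(y + r * np.sin(theta)), int(x + r * np.cos(theta))
            if not (0 <= i < 100 and 0 <= j < 100) or grid[i, j]:
                break
        else:
            clear += 1                    # ray reached max_dist unobstructed
    return clear / n_rays

print(f"openness at (50, 50): {openness(50, 50):.2f}")
```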

Keywords: visual openness, spatial continuity, ray-tracing algorithms, urban computation

Procedia PDF Downloads 45
1384 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum had to be reduced to eliminate defects and to improve the process capability indices Cp and Cpk of the CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied to improve the process, reduce defects, and ultimately reduce costs. A Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that led to the surface roughness targeted by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct and improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be efficiently used to phase out defects and improve the process capability index of the CNC milling process.
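
A minimal sketch of the capability indices named above: Cp = (USL − LSL) / 6σ and Cpk = min(USL − μ, μ − LSL) / 3σ. The specification limits and Ra measurements are illustrative, not the case study's data:

```python
# Process capability indices from a sample of surface roughness measurements.
import numpy as np

def cp_cpk(data, lsl, usl):
    mu, sigma = np.mean(data), np.std(data, ddof=1)
    cp = (usl - lsl) / (6 * sigma)                 # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # capability accounting for centering
    return cp, cpk

ra = np.array([0.82, 0.79, 0.85, 0.80, 0.78, 0.83])  # surface roughness Ra, um
cp, cpk = cp_cpk(ra, lsl=0.60, usl=1.00)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```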

Keywords: CNC machining, six sigma, surface roughness, Taguchi methodology

Procedia PDF Downloads 242
1383 Assessment of Microbiological Feed Safety from Serbian Market from 2013 to 2017

Authors: Danijela Vuković, Radovan Čobanović, Milorad Plačkić

Abstract:

Population growth drives increased consumption of animal meat, whose quality is directly affected by the quality of the feed the animals receive. The selection of raw materials, hygiene during the technological process, various hydrothermal treatments, methods of mixing, etc., all influence the quality of feed. Monitoring of feed is very important for obtaining information about feed quality and for the prevention of animal diseases that can lead to outbreaks of various human diseases. In this study, parameters of feed safety were monitored. Accordingly, the goal of this study was to evaluate the microbiological safety of feed (feedstuffs and complete mixtures). The total number of analyzed samples was 4,399. The feed samples were collected in various retail shops and feed factories over a period of 44 months (from January 2013 until September 2017). Samples were analyzed for Salmonella spp. and Clostridium perfringens in 50 g quantities, according to Serbian regulations. All microorganisms were tested according to ISO methodology: Salmonella spp. by ISO 6579:2002 and Clostridium perfringens by ISO 7937:2004. Of the 4,399 analyzed feed samples, 97.5% were satisfactory and 2.5% unsatisfactory concerning Salmonella spp. As far as Clostridium perfringens is concerned, 100% of the analyzed samples were satisfactory. The obtained results suggest that the technological processing of feed in Serbia is at a high level when it comes to safety and hygiene, but there are still possibilities for progress and improvement, which can only be reached through permanent monitoring of feed.

Keywords: microbiology, safety, hygiene, feed

Procedia PDF Downloads 302
1382 Green Technologies Developed by JSC “NIUIF”

Authors: Andrey Norov

Abstract:

In recent years, the Samoilov Research Institute for Mineral Fertilizers JSC “NIUIF”, the oldest industry-oriented institute in Russia (established in September 1919), has developed a range of sustainable, environment-friendly, zero-waste technologies that ensure minimal consumption of materials and energy resources and are fully consistent with the principles of green chemistry. These include: - An ecofriendly, energy- and resource-saving technology for sulfuric acid from sulfur according to the DC-DA (double conversion - double absorption) scheme; - An improved zero-waste technology for wet phosphoric acid (WPA) by the dihydrate-hemihydrate process, applicable to various types of phosphate raw materials; - A flexible, efficient, zero-waste, universal technology for NP/NPS/NPK/NPKS fertilizers with maximum heat recovery from chemical processes; - A novel, zero-waste technology, without analogue, for granular PK/PKS/NPKS fertilizers with a controlled dissolution rate and nutrient supply into the soil, which allows a number of wastes and by-products to be processed; - An innovative, resource-saving joint processing of the production wastes phosphogypsum and fluorosilicic acid (FSA) into ammonium sulfate, with simultaneous neutralization of fluoride compounds and no lime used; - A new fertilizer technology of increased environmental and agrochemical efficiency (currently under development). All the listed green technologies are protected by Russian and Eurasian patents. The development of ecofriendly, safe, green technologies is ongoing at JSC “NIUIF”.

Keywords: NPKS fertilizers, FSA, sulfuric acid, WPA

Procedia PDF Downloads 93
1381 Case Study of High-Resolution Marine Seismic Survey in Shallow Water, Arabian Gulf, Saudi Arabia

Authors: Almalki M., Alajmi M., Qadrouh Y., Alzahrani E., Sulaiman A., Aleid M., Albaiji A., Alfaifi H., Alhadadi A., Almotairy H., Alrasheed R., Alhafedh Y.

Abstract:

High-resolution marine seismic surveying is a well-established technique commonly used to characterize near-surface sediments and geological structures in shallow water. We conducted a single-channel seismic survey to provide high-quality seismic images of near-surface sediments up to 100 m depth in the Jubal coastal area, Arabian Gulf. An eight-hydrophone streamer was used to collect stacked seismic traces along a 5 km seismic line. To reach the required depth, we used a sparker system that discharges energies above 5000 J, with an expected frequency output spanning the range from 200 to 2000 Hz. A suitable processing flow was implemented to enhance the signal-to-noise ratio of the seismic profile. We found that the shallow sedimentary layers at the study site have a complex pattern of reflectivity, which decays significantly due to the amount of source energy used as well as the multiples associated with the seafloor. In fact, the results reveal that single-channel marine seismic surveying in shallow water is a cost-effective technique that can be easily repeated to observe any possible changes in the physical properties of the near-surface layers.
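
The paper does not detail its processing flow, but band-pass filtering matched to the source bandwidth is one standard step. A minimal SciPy sketch (the sampling rate and trace are illustrative assumptions):

```python
# Zero-phase band-pass filter matched to the sparker's 200-2000 Hz output.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 8000                                         # assumed sampling rate, Hz
trace = np.random.default_rng(0).normal(size=fs)  # stand-in for one seismic trace

b, a = butter(4, [200, 2000], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, trace)                  # forward-backward filtering, zero phase
```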

Keywords: shallow marine single-channel data, high resolution, frequency filtering, shallow water

Procedia PDF Downloads 70
1380 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis

Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen

Abstract:

Hepatitis is one of the most common and dangerous diseases affecting humankind, exposing millions of people to serious health risks every year. Diagnosis of hepatitis has always been a challenge for physicians. This paper presents an effective method for diagnosing hepatitis based on interval Type-II fuzzy logic. The proposed system includes three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an “indirect approach” is used for fuzzy system modeling, implementing the exponential compactness and separation index to determine the number of rules in the fuzzy clustering approach. We first propose a Type-I fuzzy system, which achieved an accuracy of approximately 90.9%. In this system, the process of diagnosis faces vagueness and uncertainty in the final decision, so the imprecise knowledge was managed using interval Type-II fuzzy logic. The results obtained show that interval Type-II fuzzy logic can diagnose hepatitis with an average accuracy of 93.94%, the highest classification accuracy reached thus far. This rate of accuracy demonstrates that the Type-II fuzzy system performs better than Type-I and indicates the higher capability of Type-II fuzzy systems for modeling uncertainty.
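
A minimal sketch of the interval Type-II idea underlying such systems: a Gaussian membership function with an uncertain mean yields a lower and an upper membership for each input (the footprint of uncertainty). The feature and parameter values are illustrative, not the diagnostic system's rule base:

```python
# Interval Type-II Gaussian membership with uncertain mean in [m_lo, m_hi].
import numpy as np

def it2_membership(x, m_lo, m_hi, sigma):
    g = lambda m: np.exp(-0.5 * ((x - m) / sigma) ** 2)
    upper = g(np.clip(x, m_lo, m_hi))      # max membership over means in [m_lo, m_hi]
    lower = np.minimum(g(m_lo), g(m_hi))   # min membership over means in [m_lo, m_hi]
    return lower, upper

# e.g. membership of a bilirubin reading in a "high" fuzzy set with uncertain centre
lo, up = it2_membership(x=2.4, m_lo=2.0, m_hi=3.0, sigma=0.8)
print(f"membership interval: [{lo:.2f}, {up:.2f}]")
```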

Keywords: hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection

Procedia PDF Downloads 305
1379 Cloud Shield: Model to Secure User Data While Using Content Delivery Network Services

Authors: Rachna Jain, Sushila Madan, Bindu Garg

Abstract:

Cloud computing is a key powerhouse in numerous organizations due to the shifting of their data to the cloud environment. In recent years, it has been observed that cloud-based services are being used on a large scale for content storage, distribution, and processing. Various issues have been observed in the cloud computing environment that need to be addressed, with security and privacy found to be the topmost areas of concern. In this paper, a novel security model is proposed to secure data while utilizing CDN services such as image-to-icon conversion. A CDN service is a content delivery service that converts, for example, an image to an icon, Word to PDF, or LaTeX to PDF. The presented model converts an image into an icon while keeping the image secret; security is imparted so that the image can be encrypted and decrypted only by the data owner. The paper also discusses how the server performs multiplication and selection on encrypted data without decryption. The data can be an image file, a word-processing file, or an audio or video file. Moreover, the proposed model is capable of multiplying images, encrypting them, and sending them to a server application for conversion. Ultimately, the prime objective is to encrypt an image and convert the encrypted image to an image icon by utilizing homomorphic encryption.
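
The abstract does not name its encryption scheme; a minimal sketch using the python-paillier (phe) library shows the general idea, since Paillier lets a server scale encrypted pixel values by plaintext constants without decrypting them:

```python
# Owner encrypts pixels; server multiplies ciphertexts by a plaintext constant;
# only the private key holder can decrypt the processed result.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

pixels = [17, 200, 63]                               # a few image pixel values
encrypted = [public_key.encrypt(p) for p in pixels]  # owner encrypts the image

# Server side: scale the encrypted pixels without ever seeing the plaintext.
scaled = [c * 3 for c in encrypted]

# Owner side: decrypt the processed image.
print([private_key.decrypt(c) for c in scaled])      # -> [51, 600, 189]
```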

Keywords: cloud computing, user data security, homomorphic encryption, image multiplication, CDN service

Procedia PDF Downloads 332