Search results for: 2d and 3d data conversion
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26331

24051 Efficient L-Xylulose Production Using Whole-Cell Biocatalyst With NAD+ Regeneration System Through Co-Expression of Xylitol Dehydrogenase and NADH Oxidase in Escherichia Coli

Authors: Mesfin Angaw Tesfay

Abstract:

L-Xylulose is a potentially valuable rare sugar used as a starting material for antiviral and anticancer drug development in the pharmaceutical industry. L-Xylulose exists at very low concentrations in nature and has to be synthesized from cheap starting materials such as xylitol through biotechnological approaches. In this study, cofactor engineering and deep eutectic solvents were applied to improve the efficiency of L-xylulose production from xylitol. A water-forming NAD+ regeneration enzyme (NADH oxidase) from Streptococcus mutans ATCC 25175 was introduced into E. coli together with the xylitol-4-dehydrogenase (XDH) of Pantoea ananatis, resulting in recombinant cells harboring the vector pETDuet-xdh-SmNox. Further, three deep eutectic solvents (DESs), choline chloride/glycerol (ChCl/G), choline chloride/urea (ChCl/U), and choline chloride/ethylene glycol (ChCl/EG), were employed to improve the conversion efficiency of xylitol to L-xylulose. The co-expression system exhibited optimal activity at 37 ℃ and pH 8.5, and the addition of Mg2+ enhanced the catalytic activity 1.19-fold. Co-expression of NADH oxidase with the XDH enzyme increased the L-xylulose concentration and productivity from xylitol, as well as the intracellular NAD+ concentration. Two of the DESs used (ChCl/U and ChCl/EG) showed positive effects on product yield, while ChCl/G had an inhibitory effect. The optimum concentration of ChCl/U was 2.5%, which increased the L-xylulose yield compared to the control without DES. In a 1 L fermenter, the final concentration and productivity of L-xylulose from 50 g/L of xylitol reached 48.45 g/L and 2.42 g/L·h, respectively, the highest reported so far. Overall, this study presents a suitable approach for large-scale production of L-xylulose from xylitol using engineered E. coli cells.
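
As a quick consistency check (not part of the abstract itself), the product yield on substrate and the implied fermentation time follow directly from the reported titer and productivity:

```latex
Y_{P/S} = \frac{48.45~\mathrm{g/L}}{50~\mathrm{g/L}} \approx 0.97~\mathrm{g/g},
\qquad
t = \frac{C}{P} = \frac{48.45~\mathrm{g/L}}{2.42~\mathrm{g/(L\cdot h)}} \approx 20~\mathrm{h}
```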

Keywords: Xylitol-4-dehydrogenase, NADH oxidase, L-xylulose, Xylitol, Coexpression, DESs

Procedia PDF Downloads 30
24050 The Role of Fluid Catalytic Cracking in Process Optimisation for Petroleum Refineries

Authors: Chinwendu R. Nnabalu, Gioia Falcone, Imma Bortone

Abstract:

Petroleum refining is a chemical process in which the raw material (crude oil) is converted into finished commercial products for end users. The fluid catalytic cracking (FCC) unit is a key asset in refineries, requiring optimised processes in the context of engineering design. Following the first stage of separation of crude oil in a distillation tower, an additional 40 per cent can be added to the gasoline pool by further conversion of the downgraded product of crude oil (residue from the distillation tower) using a catalyst in the FCC process. Effective removal of sulphur oxides, nitrogen oxides, carbon and heavy metals from FCC gasoline requires greater separation efficiency and is of enormous environmental significance. The FCC unit is primarily a reactor and regeneration system which employs cyclone systems for separation. Catalyst losses in FCC cyclones lead to high particulate matter emission on the regenerator side and fines carryover into the product on the reactor side. This paper aims to demonstrate the importance of FCC unit design criteria in terms of technical performance and compliance with environmental legislation. A systematic review of state-of-the-art FCC technology was carried out, identifying its key technical challenges and sources of emissions. Case studies of petroleum refineries in Nigeria were assessed against selected global case studies. The review highlights the need for further modelling investigations to help improve FCC design to more effectively meet product specification requirements while complying with stricter environmental legislation.

Keywords: design, emission, fluid catalytic cracking, petroleum refineries

Procedia PDF Downloads 139
24049 Transport Medium That Prevents the Conversion of Helicobacter Pylori to the Coccoid Form

Authors: Eldar Mammadov, Konul Mammadova, Aytaj Ilyaszada

Abstract:

Background: Many studies have shown that H. pylori transforms into a coccoid form that cannot be cultured and has poor metabolic activity. In this study, we succeeded in preserving the spiral shape of H. pylori for a long time by preparing a biphasic transport medium with a solid bottom (Mueller-Hinton agar with 7% HRBC (horse red blood cells), 5 ml) and a liquid top part (BH (brain heart) broth + HS (horse serum) + 7% HRBC + antibiotics (vancomycin 5 mg, trimethoprim lactate 25 mg, polymyxin B 1250 I.U.)) in cell culture flasks with filter caps. For comparison, we also used a BH broth medium with 7% HRBC used for the transport of H. pylori. Methods: Seven rapid-urease-test-positive biopsy specimens were inoculated into both the biphasic medium and the BH broth medium with 7% HRBC, placed in CO2 GasPak packages, and sent to the laboratory. Both media were then kept in an incubator at 37 °C for 1 day. After diagnosis by microscopy, PCR, and urease testing, the specimens were transferred to Columbia agar with 7% HRBC. After incubation at 37 °C for 5-7 days, cultures were examined for colony characteristics and bacterial morphology, and E-test antimicrobial susceptibility testing was performed. Results: Three cultures passed to Columbia agar with 7% HRBC grew from the biphasic transport medium, versus only one from the BH broth medium with 7% HRBC. We also observed that after the first 3 days in the BH broth medium with 7% HRBC, H. pylori converted to the coccoid form and its biochemical activity weakened, whereas its spiral shape remained unchanged for 2-3 weeks in the biphasic transport medium. Conclusions: Using the biphasic transport medium we have prepared, we can culture the bacterium by preventing H. pylori from converting to the coccoid form. In our opinion, this may enable wide use of the culture method for diagnosis of H. pylori, study of antibiotic susceptibility, and molecular genetic analysis.

Keywords: clinical trial, H. pylori, coccoid form, transport medium

Procedia PDF Downloads 76
24048 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect change in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracies and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors will contain uncertainties, and sometimes they are conflicting. In this study, we present a framework that takes advantage of the capabilities of Evidence Theory (a.k.a. Dempster-Shafer and Dezert-Smarandache Theories) for representing and managing uncertainty and conflict, to achieve fast change detection and effectively deal with complementary hypotheses. Specifically, the Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values among all sensors. Furthermore, we apply the method to estimate the minimum number of sensors needed in the combination, so that computational efficiency can be improved. A cumulative sum (CUSUM) test is then applied to the pignistic probability ratio to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
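
A minimal sketch of the detection pipeline described above, assuming Gaussian pre- and post-change distributions. The evidence-combination step across sensors (mass functions and combination rules) is omitted, and the CUSUM here runs on the plain log-likelihood ratio rather than the pignistic probability ratio; all names and data are illustrative:

```python
import numpy as np
from scipy.stats import norm

def kl_gaussian(mu0, var0, mu1, var1):
    # KL divergence KL(N(mu0, var0) || N(mu1, var1)) between univariate Gaussians,
    # used as the distance between the estimated current and reference distributions.
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def cusum(x, mu_pre, mu_post, sigma, h=10.0):
    # One-sided CUSUM on the log-likelihood ratio; returns the first alarm index.
    llr = norm.logpdf(x, mu_post, sigma) - norm.logpdf(x, mu_pre, sigma)
    s = 0.0
    for t, l in enumerate(llr):
        s = max(0.0, s + l)
        if s > h:
            return t
    return None

# Synthetic single-sensor stream: the mean shifts from 0 to 1 at t = 500.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(1, 1, 500)])
print("KL(pre || post) =", kl_gaussian(0, 1, 1, 1))
print("alarm at t =", cusum(x, mu_pre=0, mu_post=1, sigma=1))
```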

Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data

Procedia PDF Downloads 338
24047 Effective Energy Saving of a Large Building through Multiple Approaches

Authors: Choo Hong Ang

Abstract:

The most popular approach to saving energy in large commercial buildings in Malaysia is to replace the existing chiller plant of high kW/ton with one of lower kW/ton. This approach, however, entails a large capital outlay with a long payback period of up to 7 years. This paper shows that by using multiple approaches, other than replacing the existing chiller plant, an energy saving of up to 20% is possible. The main methodology adopted was to identify and then plug all heat ingress paths into the building, including putting up glass structures to prevent mixing of internal air-conditioned air with the ambient environment and replacing air curtains with glass doors. This methodology alone could cut the energy bill by up to 10%. Another methodology was to convert the fixed-speed motors of air handling units (AHU) to variable speed drives (VSD) and to change escalators to motion-sensor type. Other methodologies included reducing heat load by blocking air supply to non-occupied parcels, rescheduling chiller plant operation, changing fluorescent lights to LED lights, and converting from tariff B to C1. A case example of Komtar, the tallest building in Penang, is given here. The total energy bill for Komtar was USD 2,303,341 in 2016 but was reduced to USD 1,842,927.39 in 2018, a significant saving of USD 460,413.86, or 20%. In terms of kWh, there was a reduction from 18,302,204 kWh in 2016 to 14,877,105 kWh in 2018, a reduction of 3,425,099 kWh or 18.71%. The methodologies used were relatively low cost, and the payback period was merely 24 months. With this achievement, the Komtar building was awarded champion of the Malaysian National Energy Award 2019 and second runner-up of the ASEAN Energy Award. This experience shows that a strong commitment to energy saving is the key to effective energy saving.
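
The reported figures can be verified directly (a small check script, not part of the original case study):

```python
bill_2016, bill_2018 = 2_303_341.00, 1_842_927.39   # USD
kwh_2016, kwh_2018 = 18_302_204.00, 14_877_105.00   # kWh

bill_saving = bill_2016 - bill_2018
kwh_saving = kwh_2016 - kwh_2018
print(f"bill saving: USD {bill_saving:,.2f} ({bill_saving / bill_2016:.1%})")  # ~20%
print(f"energy saving: {kwh_saving:,.0f} kWh ({kwh_saving / kwh_2016:.2%})")   # 18.71%
```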

Keywords: chiller plant, energy saving measures, heat ingress, large building

Procedia PDF Downloads 109
24046 Application of KL Divergence for Estimation of Each Metabolic Pathway Genes

Authors: Shohei Maruyama, Yasuo Matsuyama, Sachiyo Aburatani

Abstract:

The development of methods to annotate unknown gene functions is an important task in bioinformatics. One approach to the annotation is the identification of the metabolic pathway that genes are involved in. Gene expression data have been utilized for this identification, since gene expression data reflect various intracellular phenomena. However, it has been difficult to estimate gene function with high accuracy. The low accuracy of the estimation is thought to be caused by the difficulty of accurately measuring gene expression: even when measured under the same conditions, gene expressions usually vary. In this study, we propose a feature extraction method that focuses on the variability of gene expressions to estimate genes' metabolic pathways accurately. First, we estimated the distribution of each gene's expression from replicate data. Next, we calculated the similarity between all gene pairs by KL divergence, a method for comparing distributions. Finally, we utilized the similarity vectors as feature vectors and trained a multiclass SVM to identify the genes' metabolic pathways. To evaluate the developed method, we applied it to budding yeast and trained a multiclass SVM to identify seven metabolic pathways. As a result, the accuracy obtained with the developed method was higher than that calculated from the raw gene expression data. Thus, our method combined with KL divergence is useful for identifying genes' metabolic pathways.
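
A compact sketch of the pipeline described above, assuming each gene's replicate expressions are summarized as a univariate Gaussian; the data, pathway labels, and dimensions below are illustrative, not the budding yeast data of the study:

```python
import numpy as np
from sklearn.svm import SVC

def kl_gaussian(mu0, var0, mu1, var1):
    # KL(N0 || N1) between per-gene expression distributions estimated from replicates
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

rng = np.random.default_rng(1)
n_genes, n_reps = 60, 8
expr = rng.normal(rng.normal(0, 2, (n_genes, 1)), 1.0, (n_genes, n_reps))  # replicates
labels = rng.integers(0, 7, n_genes)   # 7 metabolic pathways, as in the study

mu, var = expr.mean(axis=1), expr.var(axis=1, ddof=1)
# Feature vector of gene i = its KL divergences to every other gene's distribution.
features = np.array([[kl_gaussian(mu[i], var[i], mu[j], var[j])
                      for j in range(n_genes)] for i in range(n_genes)])

clf = SVC(kernel="rbf").fit(features, labels)   # multiclass SVM
print("training accuracy on toy data:", clf.score(features, labels))
```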

Keywords: metabolic pathways, gene expression data, microarray, Kullback–Leibler (KL) divergence, support vector machines (SVM), machine learning

Procedia PDF Downloads 406
24045 Effective Water Purification by Impregnated Carbon Nanotubes

Authors: Raviteja Chintala

Abstract:

Water shortages in many areas of the world have greatly increased the demand for efficient methods of producing drinking water, so purification of water using cost-effective and efficient methods is a challenging field of research. In this regard, reverse osmosis membrane desalination of both seawater and inland brackish water is currently being deployed in various locations around the world. In the present work, an attempt is made to integrate these existing technologies with a novel method, wherein carbon nanotubes prepared at the lab scale replace the activated carbon tubes used traditionally. This has proven to enhance the efficiency of the water filter, effectively neutralising most of the organic impurities, and furthermore ensures a reduction in TDS. Carbon nanotubes have a wide scope of applications, such as composite reinforcements, field emitters, sensors, energy storage and energy conversion devices, and catalyst support phases, because of their unusual mechanical, electrical, thermal and structural properties. In particular, the large specific surface area, as well as the high chemical and thermal stability, makes carbon nanotubes an attractive adsorbent in wastewater treatment, effective in eliminating harmful media from water. In this work, the candle soot method was used for the preparation of carbon nanotubes, which were mixed with activated charcoal in different compositions. The effect of the composition change was monitored using a TDS meter. As the composition of nanocarbon increases, the TDS of the water gradually decreases. The larger surface area provided by the nanotubes also enhances the lifetime of the carbon filter.

Keywords: TDS (Total Dissolved Solids), carbon nanotubes, water, candle soot

Procedia PDF Downloads 344
24044 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan

Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail

Abstract:

Recently, the advent of emerging technology has created a new generation of restaurant marketing. This study explores the aspects that influence customers' decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between the credibility of the source and customers' attitude toward restaurant visits. Grounded in source credibility theory, the data were collected by distributing a survey questionnaire through Google Forms. A non-probability purposive sampling technique was used, and the questionnaire employed a pre-developed and validated scale by Ohanian to measure the relationship. Data were collected from 250 respondents in order to investigate the influence of food bloggers on Generation Z's decision-making process. SPSS version 26 was used for statistical testing and analyzing the data. The findings of the survey revealed a moderate positive correlation between the variables. It can therefore be concluded that food bloggers do have an impact on Generation Z's decision-making process.

Keywords: credibility, decision making, food bloggers, generation z, e-wom

Procedia PDF Downloads 78
24043 Performance Measurement of Logistics Systems for Thailand's Wholesales and Retails Industries by Data Envelopment Analysis

Authors: Pornpimol Chaiwuttisak

Abstract:

The study aims to compare the performance of logistics in Thailand's wholesale and retail trade industries (except motor vehicles, motorcycles, and stalls) by using data envelopment analysis (DEA). The Thailand Standard Industrial Classification of 2009 (TSIC-2009) categorizes these industries into sub-group no. 45: wholesale and retail trade (except for the repair of motor vehicles and motorcycles), sub-group no. 46: wholesale trade (except motor vehicles and motorcycles), and sub-group no. 47: retail trade (except motor vehicles and motorcycles). The data used in the study were collected by the National Statistical Office, Thailand. The study used four input factors: the number of companies, the number of personnel in logistics, the training cost in logistics, and outsourced logistics management. The output factor is the percentage of enterprises having inventory management. The results showed that the average relative efficiency was 27.87 percent for small-sized enterprises and 49.68 percent for medium-sized enterprises.
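
The abstract does not specify the DEA model variant; the sketch below solves the standard input-oriented CCR efficiency of one decision making unit as a linear program, which is the usual starting point (the toy data, with four inputs and one output as in the study, are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    # Input-oriented CCR DEA efficiency of DMU k.
    # X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs); decision vars: [theta, lambdas].
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta
    A_in = np.c_[-X[k].reshape(-1, 1), X.T]         # sum_j lam_j x_ij <= theta * x_ik
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]  # sum_j lam_j y_rj >= y_rk
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun                                  # efficiency score in (0, 1]

rng = np.random.default_rng(2)
X = rng.uniform(1, 10, (5, 4))   # 5 enterprises, 4 inputs (companies, staff, training, outsourcing)
Y = rng.uniform(1, 10, (5, 1))   # 1 output (share with inventory management)
print([round(ccr_efficiency(X, Y, k), 3) for k in range(5)])
```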

Keywords: DEA, wholesales and retails, logistics, Thailand

Procedia PDF Downloads 419
24042 Event Data Representation Based on Time Stamp for Pedestrian Detection

Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita

Abstract:

In association with the wave of electric vehicles (EV), low energy consumption systems have become more and more important. One of the key technologies for realizing low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several notable features, such as high temporal resolution (up to 1 Mframe/s) and a high dynamic range (120 dB). However, the property that can contribute most to low energy consumption is its sparsity: the sensor only captures pixels that undergo an intensity change, so there is no signal in areas without any intensity change. That is to say, this sensor is more energy efficient than conventional sensors such as RGB cameras because redundant data can be removed. On the other hand, the data are difficult to handle because the data format is completely different from RGB images: acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values: +1 or -1), and a timestamp; it does not include intensity such as RGB values. Therefore, existing algorithms cannot be used straightforwardly, and a new processing algorithm has to be designed to cope with DVS data. To overcome the difficulties caused by the data format differences, most prior art constructs frame data and feeds it to deep learning models such as convolutional neural networks (CNN) for object detection and recognition purposes. However, even when the data can be fed to the network, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as intensity instead of RGB pixel values, polarity information is clearly not rich enough. In this context, we propose to use the timestamp information as the data representation fed to deep learning. Concretely, we first construct frame data divided by a certain time period, then assign intensity values according to the timestamps in each frame; for example, a high value is given to a recent signal. We expected this data representation to capture the features especially of moving objects, because timestamps represent movement direction and speed. Using the proposed method, we built our own dataset with a DVS fixed on a parked car to develop an application for a surveillance system that can detect persons around the car. We consider the DVS to be one of the ideal sensors for surveillance purposes because it can run for a long time with low energy consumption in a mostly static scene. For comparison, we reproduced a state-of-the-art method as a benchmark, which constructs frames in the same way as ours but feeds polarity information to the CNN. We then measured the object detection performance of the benchmark and of our method on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
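
A minimal sketch of the timestamp-based representation described above: events inside a time window are accumulated into a frame whose pixel value encodes how recent the last event was, rather than its polarity (the resolution, window length, and normalization are illustrative assumptions):

```python
import numpy as np

def timestamp_frame(events, t_start, t_end, height, width):
    # Build one frame from DVS events (x, y, polarity, timestamp): each pixel keeps
    # the most recent event time in the window, normalized so newer events are brighter.
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, _pol, ts in events:
        if t_start <= ts < t_end:
            frame[y, x] = max(frame[y, x], ts - t_start)
    return frame / (t_end - t_start)

# Toy event stream: (x, y, polarity, timestamp in microseconds).
events = [(10, 5, +1, 1_000), (10, 5, -1, 9_000), (3, 7, +1, 4_000)]
f = timestamp_frame(events, t_start=0, t_end=10_000, height=16, width=16)
print(f[5, 10], f[7, 3])   # 0.9 and 0.4: the more recent event is brighter
```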

Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption

Procedia PDF Downloads 104
24041 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea

Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi

Abstract:

Synoptic patterns from the surface up to the tropopause are very important for forecasting weather and atmospheric conditions, and there are many tools to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in forecasting centers worldwide to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is a major challenge due to the complex topography and the different climate types in these areas. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest ECMWF reanalysis; its temporal resolution is hourly, while that of NCEP/NCAR is six-hourly. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different precipitation types (rain and snow) were considered. The results showed that NCEP/NCAR is better able to capture the intensity of atmospheric systems, while ERA5 is suitable for extracting parameter values at specific points and appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it; however, the NCEP/NCAR sea surface temperature product has low resolution near the coast. Both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS, although, due to their time lag, they are not suitable for forecast centers; their application lies in research and in the verification of meteorological models. Finally, ERA5 has better resolution than the NCEP/NCAR reanalysis data, but NCEP/NCAR data are available from 1948 and are appropriate for long-term research.

Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow

Procedia PDF Downloads 127
24040 Application of Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM) Database in Nursing Health Problems with Prostate Cancer: A Pilot Study

Authors: Hung Lin-Zin, Lai Mei-Yen

Abstract:

Prostate cancer is the most commonly diagnosed male cancer in the U.S., with a prevalence of around 1 in 8. The etiology of prostate cancer is still unknown, but predisposing factors such as age, black race, family history, and obesity may increase the risk of the disease. In 2020, a total of 7,178 people in Taiwan were newly diagnosed with prostate cancer, accounting for 5.88% of all cancer cases, with an incidence rate ranking fifth among men. In that year, the total number of deaths from prostate cancer was 1,730, accounting for 3.45% of all cancer deaths, ranking sixth among male cancer deaths, and accounting for 94.34% of deaths from cancers of the male reproductive organs. A search of the domestic and foreign literature on OMOP (Observational Medical Outcomes Partnership) database analysis shows nearly a hundred published studies, but studies of nursing-related health problems and nursing measures built on the OMOP common data model databases of medical institutions are extremely rare. The OMOP common data model analysis platform is a system developed by the FDA in 2007 that uses a common data model (CDM) to analyze and monitor healthcare data. Building up relevant nursing information from the OMOP-CDM database is important to assist our daily practice. Therefore, we chose prostate cancer patients, a major group among those in our care, and used the OMOP-CDM database to explore their common associated health problems. With the assistance of OMOP-CDM database analysis, we expect earlier diagnosis and prevention of prostate cancer patients' comorbidities to improve patient care.

Keywords: OMOP, nursing diagnosis, health problem, prostate cancer

Procedia PDF Downloads 76
24039 Investigation of Learning Challenges in Building Measurement Unit

Authors: Argaw T. Gurmu, Muhammad N. Mahmood

Abstract:

The objective of this research is to identify the learning challenges that architecture and construction management students face in the building measurement unit. This research used survey data collected from students who had completed the building measurement unit. NVivo qualitative data analysis software was used to identify relevant themes. The analysis of the qualitative data revealed major learning difficulties such as the inadequacy of practice questions for the examination, inability to work as a team, lack of detailed understanding of the prerequisite units, insufficient time allocated for tutorials, and incompatibility of lecture and tutorial schedules. The output of this research can be used as a basis for improving teaching and learning activities in construction measurement units.

Keywords: building measurement, construction management, learning challenges, evaluate survey

Procedia PDF Downloads 144
24038 Using Data-Driven Model on Online Customer Journey

Authors: Ing-Jen Hung, Tzu-Chien Wang

Abstract:

Nowadays, customers can easily interact with firms through miscellaneous online ads on different channels. In other words, customers now have innumerable options and limitless time to accomplish their commercial activities with firms, individualizing their own online customer journeys. This kind of convenience emphasizes the importance of online advertisement allocation across channels, and a profound understanding of customer behavior can yield considerable benefit by optimizing fund allocation across diverse ad channels. To achieve this objective, many firms utilize numerical methodologies to create data-driven advertisement policies. In our research, we aim to exploit online customer click data to discover the correlations between channels and their sequential relations. We use an LSTM to deal with the sequential property of our data and compare its accuracy with that of non-sequential methods, such as a CART decision tree and logistic regression. Besides, we classify our customers into several groups by their behavioral characteristics, to perceive the differences between groups as customer portraits. As a result, we discover a distinct customer journey under each customer portrait. Our article provides insights for marketing research and can help firms formulate online advertising criteria.
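
The abstract does not publish the network architecture; the sketch below shows one plausible shape of an LSTM over click sequences, assuming journeys are encoded as sequences of channel IDs and the target is a binary conversion label (all names, sizes, and data are illustrative):

```python
import torch
import torch.nn as nn

# Each journey is a sequence of ad-channel IDs (e.g. 0=search, 1=display,
# 2=social, 3=email); the model predicts conversion from the click sequence.
N_CHANNELS, EMBED, HIDDEN, SEQ_LEN = 4, 8, 16, 10

class JourneyLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(N_CHANNELS, EMBED)
        self.lstm = nn.LSTM(EMBED, HIDDEN, batch_first=True)
        self.head = nn.Linear(HIDDEN, 1)

    def forward(self, x):                       # x: (batch, seq_len) channel IDs
        _, (h, _) = self.lstm(self.embed(x))
        return self.head(h[-1]).squeeze(-1)     # conversion logit per customer

model = JourneyLSTM()
x = torch.randint(0, N_CHANNELS, (32, SEQ_LEN))   # toy batch of journeys
y = torch.randint(0, 2, (32,)).float()            # toy conversion labels
loss = nn.BCEWithLogitsLoss()(model(x), y)
loss.backward()
print("toy loss:", loss.item())
```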

Keywords: LSTM, customer journey, marketing, channel ads

Procedia PDF Downloads 124
24037 Use of Corn Stover for the Production of 2G Bioethanol, Enzymes, and Xylitol Under a Biorefinery Concept

Authors: Astorga-Trejo Rebeca, Fonseca-Peralta Héctor Manuel, Beltrán-Arredondo Laura Ivonne, Castro-Martínez Claudia

Abstract:

The use of biomass as feedstock for the production of fuels and other chemicals of interest is an increasingly accepted option on the way to developing biorefinery complexes. In the Mexican state of Sinaloa, two million tons of residues from corn crops are produced every year, most of which can be converted to bioethanol and other products through biotechnological conversion using yeast and other microorganisms. Therefore, the objective of this work was to take advantage of corn stover and evaluate its potential as a substrate for the production of second-generation (2G) bioethanol, enzymes, and xylitol. To produce 2G bioethanol, an acid-alkaline pretreatment was carried out prior to saccharification and fermentation. The microorganisms used for the production of enzymes, as well as for the production of xylitol, were isolated and characterized in our workgroup. Statistical analysis was performed using Design Expert version 11.0. The results showed that it is possible to obtain 2G bioethanol employing corn stover as a carbon source and Saccharomyces cerevisiae ItVer01 and Candida intermedia CBE002, with yields of 0.42 g and 0.31 g, respectively. It was also shown that C. intermedia can produce xylitol with a good yield (0.46 g/g). On the other hand, qualitative and quantitative studies showed that the native strains Fusarium equiseti (0.4 IU/mL, xylanase), Bacillus velezensis (1.2 IU/mL, xylanase, and 0.4 IU/mL, amylase), and Penicillium funiculosum (1.5 IU/mL, cellulases) can produce xylanases, amylases, or cellulases using corn stover as raw material. This study demonstrates that corn stover, a low-cost raw material with high availability in our country, can be used as a carbon source to obtain bioproducts of industrial interest through processes that are more environmentally friendly and sustainable. Optimization of each bioprocess remains necessary.

Keywords: biomass, corn stover, biorefinery, bioethanol 2G, enzymes, xylitol

Procedia PDF Downloads 176
24036 Syntheses of Biobased Hybrid Poly(epoxy-hydroxyurethane) Polymers

Authors: Adrien Cornille, Sylvain Caillol, Bernard Boutevon

Abstract:

The development of polyurethanes began in 1937 at I. G. Farbenindustrie, where Bayer and coworkers discovered the addition polymerization reaction between diisocyanates and diols. Since their discovery, the demand for PU has continued to increase, reaching a production of 18 million tons in 2016. However, isocyanate compounds are harmful to humans and the environment. Methylene diphenyl 4,4'-diisocyanate (MDI) and toluene diisocyanate (TDI), the most widely used isocyanates in the PU industry, are classified as CMR (carcinogenic, mutagenic, and reprotoxic). In order to design isocyanate-free materials, an interesting alternative is the use of polyhydroxyurethanes (PHUs), obtained by reaction between cyclic carbonates and polyfunctional amines. The main problem with PHU synthesis is the low reactivity of the carbonate/amine reaction. To solve this issue, many studies in the literature have aimed to design PHUs from more reactive cyclic carbonates bearing electron-withdrawing substituents, or by using six-membered, seven-membered, or thio-cyclic carbonates. The main drawback of all these systems remains the low molar masses obtained for the synthesized PHUs, which hinders their use in material applications. Therefore, we developed another strategy to afford new hybrid PHUs with high conversion. This innovative two-step approach consists, in the first step, of synthesizing aminotelechelic PHU oligomers of different chain lengths from bis-cyclic carbonates with different excesses of primary amine functions. In the second step, these aminotelechelic PHU oligomers were used in formulations with biobased epoxy monomers (from cashew nut shell liquid and tannins) to synthesize hybrid poly(epoxy-hydroxyurethane) polymers. These materials were then characterized by thermal and mechanical analyses.

Keywords: polyurethane, polyhydroxyurethane, aminotelechelic NIPU oligomers, carbonates, epoxy, amine, epoxyurethane polymers, hybrid polymers

Procedia PDF Downloads 218
24035 A Secure Proxy Signature Scheme with Fault Tolerance Based on RSA System

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

Due to the rapid growth of modern communication systems, fault tolerance and data security are two important issues in secure transactions. During the transmission of data between the sender and receiver, errors may occur frequently, and the sender must then re-transmit the data to the receiver in order to correct them, which weakens the system. To improve the scalability of the scheme, we present a secure proxy signature scheme with fault tolerance built over an efficient and secure authenticated key agreement protocol based on the RSA cryptosystem. Authenticated key agreement protocols play an important role in building secure communications between two parties.

Keywords: proxy signature, fault tolerance, RSA, key agreement protocol

Procedia PDF Downloads 288
24034 Estimating the Receiver Operating Characteristic Curve from Clustered Data and Case-Control Studies

Authors: Yalda Zarnegarnia, Shari Messinger

Abstract:

Receiver operating characteristic (ROC) curves have been widely used in medical research to illustrate the performance of a biomarker in correctly distinguishing diseased from non-diseased groups. Correlated biomarker data arise in study designs that include subjects who share genetic or environmental factors. Information about this correlation might help to identify family members at increased risk of disease development and may lead to initiating treatment to slow or stop progression to disease. Approaches appropriate to a case-control design matched by family identification must accommodate the correlation inherent in the design when estimating the biomarker's ability to differentiate between cases and controls, as well as handle estimation from the matched design itself. This talk will review methods developed for ROC curve estimation in settings with correlated data from case-control designs and will discuss the limitations of current methods for analyzing correlated familial paired data. An alternative approach using conditional ROC curves will be demonstrated to provide appropriate ROC curves for correlated paired data. The proposed approach uses the information about the correlation among biomarker values, producing conditional ROC curves that evaluate the ability of a biomarker to discriminate between diseased and non-diseased subjects in a familial paired design.
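
For reference, a minimal empirical (unconditional) ROC and AUC computation on synthetic biomarker values; the conditional ROC approach proposed in the talk additionally conditions on the correlated family member's biomarker value, which is not shown here:

```python
import numpy as np

def roc_curve(cases, controls):
    # Empirical ROC: sweep thresholds over all observed biomarker values.
    thresholds = np.sort(np.r_[cases, controls])[::-1]
    tpr = np.array([(cases >= t).mean() for t in thresholds])
    fpr = np.array([(controls >= t).mean() for t in thresholds])
    return fpr, tpr

rng = np.random.default_rng(3)
cases, controls = rng.normal(1.0, 1, 200), rng.normal(0.0, 1, 200)
fpr, tpr = roc_curve(cases, controls)
auc = np.trapz(np.r_[0.0, tpr], np.r_[0.0, fpr])   # trapezoidal area under the curve
print(f"AUC = {auc:.3f}")   # about 0.76 for a one-SD separation in means
```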

Keywords: biomarker, correlation, familial paired design, ROC curve

Procedia PDF Downloads 241
24033 Code-Switching among Local UCSI STEM and N-STEM Undergraduates during Knowledge Sharing

Authors: Adeela Abu Bakar, Minder Kaur, Parthaman Singh

Abstract:

In the Malaysian education system, formal English language learning takes place in a content-based classroom (CBC). Until recently, few studies in Malaysia have researched the effects of code-switching (CS) behaviour on students' knowledge sharing (KS) with their peers. The aim of this study is to investigate the frequency of, reasons for, and effect that CS, from English to Bahasa Melayu, has among local STEM and N-STEM undergraduates on KS in a content-based classroom. The study employs a mixed-method research design with a questionnaire and interviews as the instruments. The data were collected through distribution of questionnaires and interviews with the undergraduates. The quantitative data were analysed using SPSS as simple frequencies and percentages, whereas the qualitative data were organized into themes and then analysed. The findings show that N-STEM undergraduates code-switch more than STEM undergraduates. In addition, both STEM and N-STEM undergraduates agree that CS acts as a catalyst for KS in a content-based classroom; however, they also acknowledge that excessive use of CS can hinder KS. The findings can provide STEM and N-STEM undergraduates, education policymakers, language teachers, university educators, and students with significant insights into the role of CS in KS in a content-based classroom. Recommendations for future studies include increasing the number of participants and adding observation to the data collection.

Keywords: code-switching, content-based classroom, content and language integrated learning, knowledge sharing, STEM and N-STEM undergraduates

Procedia PDF Downloads 139
24032 Economics of Fish-Plantain Integrated Farm Enterprise in Southern Nigeria

Authors: S. O. Obasa, J. A. Soaga, O. I. Afolabi, N. A. Bamidele, O. E. Babalola

Abstract:

Attempts to improve the income of the rural population are a welcome development in Nigeria. Integrated fish-crop farming has been suggested as a means of raising farm income, reducing wastage, and mitigating the risk component in production through the complementarity gain. A feeding trial was carried out to investigate the replacement of maize with fermented unripe plantain (Musa paradisiaca) peel meal in the diet of Nile tilapia, Oreochromis niloticus. The economics of the integrated enterprise was assessed using budgetary analysis techniques, incorporating material and labour costs as well as the returns from the sale of matured fish and plantain. A total of 60 Nile tilapia fingerlings (1.70±0.1 g) were stocked at 10 per plastic tank. Two iso-nitrogenous diets containing 35% crude protein, in which maize meal was replaced by fermented unripe plantain peel meal at 0% (FUP0/control diet) and 100% (FUP100), were formulated and prepared. The fingerlings were fed at 5% of body weight per day for 56 days. The lowest feed conversion ratio, 1.39 in fish fed diet FUP100, was not significantly different (P > 0.05) from the highest, 1.42, in fish fed the control diet. The highest percentage profit, 88.85% in fish fed diet FUP100, was significantly higher than the 66.68% in fish fed diet FUP0, while the profit index of 1.89 in fish fed diet FUP100 was significantly different from the 1.67 in fish fed diet FUP0. Therefore, fermented unripe plantain peel meal can completely replace maize in the diet of O. niloticus fingerlings. The profitability assessment shows that the net income from the integration was ₦463,000 per hectare, an increase of ₦87,750.00, representing a 12.2% increase over separate production.

Keywords: fish-crop, income, Nile tilapia, waste management

Procedia PDF Downloads 514
24031 Fuzzy Multi-Component DEA with Shared and Undesirable Fuzzy Resources

Authors: Jolly Puri, Shiv Prasad Yadav

Abstract:

Multi-component data envelopment analysis (MC-DEA) is a popular technique for measuring the aggregate performance of decision making units (DMUs) along with their components. However, conventional MC-DEA is limited to crisp input and output data, which may not always be available in exact form; in real-life problems, data may be imprecise or fuzzy. Therefore, in this paper, we propose (i) a fuzzy MC-DEA (FMC-DEA) model in which shared and undesirable fuzzy resources are incorporated, (ii) a transformation of the proposed FMC-DEA model into a pair of crisp models using the α-cut approach, (iii) definitions of the fuzzy aggregate performance of a DMU and the fuzzy efficiencies of components as fuzzy numbers, and (iv) a numerical example to validate the proposed approach.

Keywords: multi-component DEA, fuzzy multi-component DEA, fuzzy resources, decision making units (DMUs)

Procedia PDF Downloads 413
24030 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means based algorithm is developed and experimentally evaluated. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. The algorithm uses the City Block (Manhattan) distance and a new stop criterion to guarantee convergence. Experiments conducted on a real data set show its high performance when compared with the original k-means version.
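
A minimal sketch of a k-means variant under the Manhattan (City Block) metric, as described above. With the L1 distance, the component-wise median (rather than the mean) is the cost-minimizing cluster center; the stop criterion shown (no change in assignments) is an illustrative placeholder, since the abstract does not disclose its new criterion:

```python
import numpy as np

def kmeans_manhattan(X, k, max_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    labels = None
    for _ in range(max_iter):
        # Assign each point to the center with the smallest L1 (City Block) distance.
        d = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)
        new_labels = d.argmin(axis=1)
        if labels is not None and np.array_equal(new_labels, labels):
            break                                  # assignments stable: converged
        labels = new_labels
        # The per-coordinate median minimizes the total L1 distance within a cluster.
        centers = np.array([np.median(X[labels == j], axis=0) for j in range(k)])
    return labels, centers

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
_, centers = kmeans_manhattan(X, k=2)
print(centers.round(2))   # close to (0, 0) and (5, 5)
```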

Keywords: pattern recognition, global terrorism database, Manhattan distance, k-means clustering, terrorism data analysis

Procedia PDF Downloads 389
24029 AniMoveMineR: Animal Behavior Exploratory Analysis Using Association Rules Mining

Authors: Suelane Garcia Fontes, Silvio Luiz Stanzani, Pedro L. Pizzigatti Corrêa, Ronaldo G. Morato

Abstract:

Environmental changes and major natural disasters are increasingly prevalent in the world due to the damage humanity has caused to nature, and this damage directly affects the lives of animals. The study of animal behavior and animals' interactions with the environment can therefore provide knowledge that guides researchers and public agencies in preservation and conservation actions. Exploratory analysis of animal movement can determine patterns of animal behavior, and technological advances in animal tracking have expanded behavioral studies. There is much research on animal movement and behavior, but a proposal that combines these resources, allows exploratory analysis of animal movement, and provides statistical measures of individual animal behavior and its interaction with the environment is missing. The contribution of this paper is to present the AniMoveMineR framework, a unified solution that aggregates trajectory analysis and data mining techniques to explore animal movement data and provide a first step in answering questions about individual animal behavior and interactions with other animals over time and space. We evaluated the framework using monitored jaguar data from the city of Miranda, in the Pantanal, Brazil, in order to verify whether AniMoveMineR allows the interaction level between these jaguars to be identified. The results were positive and provided indications about the individual behavior of the jaguars and about which jaguars have the highest or lowest correlation.
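
AniMoveMineR itself is not reproduced here; the sketch below only illustrates the association-rules step on toy "encounter" transactions, using the mlxtend library (the jaguar names, windows, and thresholds are illustrative):

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Toy transactions: each row is a time window; a column is True when that jaguar
# was observed in the shared area during the window.
windows = pd.DataFrame(
    [[1, 1, 0], [1, 1, 0], [1, 0, 1], [0, 1, 0], [1, 1, 1]],
    columns=["jaguar_A", "jaguar_B", "jaguar_C"],
).astype(bool)

itemsets = apriori(windows, min_support=0.4, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.6)
# Rules such as {jaguar_A} -> {jaguar_B} indicate a high level of interaction.
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```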

Keywords: data mining, data science, trajectory, animal behavior

Procedia PDF Downloads 149
24028 A Study on Using Network Coding for Packet Transmissions in Wireless Sensor Networks

Authors: Rei-Heng Cheng, Wen-Pinn Fang

Abstract:

A wireless sensor network (WSN) is composed of a large number of sensors and one or a few base stations, where the sensors are responsible for detecting specific event information, which is sent back to the base station(s). How to reduce energy consumption to extend the network lifetime is a problem that cannot be ignored in wireless sensor networks. Since the sensor network is used to monitor a region or specific events, reliably sending the information back to the base station is surely important. Network coding is often used to enhance the reliability of network transmission. When a node needs to send out M data packets, it encodes these data with redundant data and sends out M + R packets in total. If the receiver can get any M packets out of these M + R packets, it can decode them and recover the original M data packets. Transmitting redundant packets, however, certainly results in excess energy consumption. This paper explores the relationship between the quality of wireless transmission and the number of redundant packets, as quantified below. Ideally, each sensor can overhear nearby transmissions, learn the wireless transmission quality around it, and dynamically determine the number of redundant packets used in network coding.
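
The trade-off described above can be quantified directly: with an independent per-packet delivery probability p, decoding succeeds when at least M of the M + R coded packets arrive (a minimal sketch under that independence assumption):

```python
from math import comb

def decode_probability(M, R, p):
    # P(receive >= M of M + R packets), each delivered independently with prob p.
    n = M + R
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(M, n + 1))

# More redundancy raises reliability but costs energy for extra transmissions.
for R in range(4):
    print(f"M=8, R={R}, link p=0.9 -> decode prob = {decode_probability(8, R, 0.9):.3f}")
```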

Keywords: energy consumption, network coding, transmission reliability, wireless sensor networks

Procedia PDF Downloads 398
24027 Pattern the Location and Area of Earth-Dumping Stations from Vehicle GPS Data in Taiwan

Authors: Chun-Yuan Chen, Ming-Chang Li, Xiu-Hui Wen, Yi-Ching Tu

Abstract:

This study explores the application of GPS (Global Positioning System) traces of construction vehicles, such as trucks or cranes, to help pattern the earth-dumping stations of traffic construction in Taiwan. Traffic construction in this research is defined as the engineering of high-speed railways, expressways, and similar projects extending over distances of kilometers. Auditing the location of earth-dumping stations and checking their compliance with regulations is one of the important tasks of the Taiwan EPA, since earth-dumping stations are a known source of particulate matter air pollution during the construction process. Because GPS data can be analyzed quickly and used conveniently, this study identified dumping stations by modeling vehicle tracks from GPS data over the work cycle of the construction. The GPS data came from 13 vehicles involved in an expressway construction project in central Taiwan. The GPS footprints were exported to Keyhole Markup Language (KML) files so that the truck tracks could be patterned by computer applications; the data were collected over about eight months, from February to October 2017. The GPS footprints identified the dumping stations and outlined the earthwork areas, and these results were passed to the Taiwan EPA for on-site inspection. The Taiwan EPA issued advisory comments to the agency in charge of the construction to prevent air pollution. Compared with the common method of environmental inspection by manual collection, the GPS-with-KML patterning and modeling method consumes less time. Moreover, monitoring the GPS data from construction vehicles could help administrators develop and implement environmental management strategies.
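
A minimal sketch of this kind of processing: reading truck positions from a KML track and flagging locations where a vehicle dwells, which are candidate dumping stations (the KML snippet, thresholds, and the crude degree-based distance are illustrative assumptions, not the study's actual criteria):

```python
import re

def parse_kml_coordinates(kml_text):
    # Extract (lon, lat) pairs from the <coordinates> elements of a KML track.
    points = []
    for block in re.findall(r"<coordinates>(.*?)</coordinates>", kml_text, re.S):
        for triple in block.split():
            lon, lat, *_ = map(float, triple.split(","))
            points.append((lon, lat))
    return points

def dwell_points(points, eps=0.0005, min_run=10):
    # Flag places where consecutive fixes stay within ~eps degrees for at least
    # min_run samples: candidate dumping/loading stops along the work cycle.
    stops, run_start = [], 0
    for i in range(1, len(points)):
        if (abs(points[i][0] - points[run_start][0]) > eps
                or abs(points[i][1] - points[run_start][1]) > eps):
            if i - run_start >= min_run:
                stops.append(points[run_start])
            run_start = i
    return stops

kml = "<coordinates>120.68,24.14,0 120.68,24.14,0 120.70,24.15,0</coordinates>"
print(parse_kml_coordinates(kml))
track = [(120.68, 24.14)] * 12 + [(120.70 + i * 0.001, 24.15) for i in range(10)]
print(dwell_points(track))   # -> [(120.68, 24.14)]
```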

Keywords: automatic management, earth-dumping station, environmental management, Global Positioning System (GPS), particulate matter, traffic construction

Procedia PDF Downloads 167
24026 Designing and Implementing a Tourist-Guide Web Service Based on Volunteer Geographic Information Using Open-Source Technologies

Authors: Javad Sadidi, Ehsan Babaei, Hani Rezayan

Abstract:

The advent of Web 2.0 makes it possible to scale down the costs of data collection and mapping, especially if the process is done by volunteers. Every volunteer can be thought of as a free and ubiquitous sensor collecting spatial, descriptive, and multimedia data for tourist services. The lack of large-scale information, such as real-time climate and weather conditions, population density, and other related data, can be considered one of the important challenges in developing countries for tourists trying to make the best decision about the time and place of travel. The current research aims to design and implement a spatiotemporal web map service using volunteer-submitted data. The service acts as a tourist guide in which tourists can search places of interest for their requested travel time. The service was designed with a three-tier architecture comprising data, logical processing, and presentation tiers. For implementation, open-source software, client- and server-side programming technologies (such as OpenLayers2, AJAX, and PHP), GeoServer as a map server, and the Web Feature Service (WFS) standard were used. The result is two distinct browser-based services: one for submitting spatial, descriptive, and multimedia volunteer data, and another for tourists and local officials. Local officials confirm the veracity of the volunteer-submitted information. In the tourist interface, a spatiotemporal search engine enables tourists to find a tourist place by province, city, and location at a specific time of interest. Implementing the tourist-guide service with this methodology means that current tourists participate in free data collection and sharing for future tourists, data are shared and accessed by all in real time, blind selection of a travel destination is avoided, and, significantly, the cost of providing such services decreases.
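
As an illustration of the WFS standard mentioned above, a GetFeature request against a GeoServer endpoint needs only a few standard parameters; the host, layer name, and attribute filter below are hypothetical, while the service/version/request/typeName/outputFormat parameters and the CQL_FILTER extension are standard GeoServer WFS usage:

```python
import requests

WFS_URL = "http://example.org/geoserver/wfs"   # hypothetical GeoServer endpoint
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "tourism:volunteer_places",        # hypothetical layer of VGI points
    "outputFormat": "application/json",
    "CQL_FILTER": "province='Gilan' AND month=4",  # hypothetical spatiotemporal filter
}
resp = requests.get(WFS_URL, params=params, timeout=30)
for feature in resp.json().get("features", []):
    print(feature["properties"].get("name"), feature["geometry"]["coordinates"])
```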

Keywords: VGI, tourism, spatiotemporal, browser-based, web mapping

Procedia PDF Downloads 103
24025 Hydrogen Production from Solid Waste of Sago Processing Industries in Indonesia: Effect of Chemical and Biological Pretreatment

Authors: Pratikno Hidayat, Khamdan Cahyari

Abstract:

Hydrogen is the ultimate choice of energy carrier for the future. It has a high energy density (42 kJ/g), emits only water vapor during combustion, and achieves energy conversion efficiencies of up to 50% in fuel cell applications. One of the promising methods of producing hydrogen from organic waste is dark fermentation, which utilizes sugar-rich organic waste as substrate and hydrogen-producing microorganisms to generate the hydrogen. The solid waste of sago processing industries in Indonesia is a promising raw material both for producing biofuel hydrogen and for mitigating the environmental impact of waste disposal. This research investigated the effect of chemical and biological pretreatment, i.e., acid treatment and mushroom cultivation, on the lignocellulosic waste of these sago industries. Chemical pretreatment was conducted by exposing the waste to sulfuric acid (H2SO4) at various molarities (0.2, 0.3, and 0.4 M) and durations of exposure (30, 60, and 90 minutes). Biological treatment was conducted by utilizing the solid waste as a growth medium for mushrooms (oyster and ling-zhi) for 3 months. Dark fermentation was conducted at pH 5.0, a temperature of 27 ℃, and atmospheric pressure. Chemical and biological pretreatment improved the hydrogen yield, with the highest yield at 3.8 ml/g VS (31%v H2). The hydrogen production generated a high percentage of hydrogen, although the yield was still low. This result indicates that the exposure to acid and the biological treatment might need to be extended to improve the degradability of the solid waste. The high percentage of hydrogen resulted from proper pretreatment of residual sludge from a biogas plant to generate the hydrogen-producing inoculum.

Keywords: hydrogen, sago waste, chemical, biological, dark fermentation, Indonesia

Procedia PDF Downloads 369
24024 Effect of Bank Specific and Macro Economic Factors on Credit Risk of Islamic Banks in Pakistan

Authors: Mati Ullah, Shams Ur Rahman

Abstract:

The purpose of this research study is to investigate the effect of macroeconomic and bank-specific factors on credit risk in Islamic banking in Pakistan. The future of financial institutions largely depends on how well they manage risks, and credit risk is an important type of risk affecting the banking sector. The study uses quarterly data for a period of six years, from 1 July 2014 to 30 June 2020. The data set consists of secondary data extracted from the websites of the State Bank and the World Bank and from the financial statements of the banks concerned. An ordinary least squares (OLS) model was used for the analysis. The results support the hypothesis that macroeconomic and bank-specific factors have a significant effect on credit risk. Among the macroeconomic variables, inflation and the exchange rate have significant positive effects on credit risk, whereas gross domestic product has a significant negative relationship with it; the corporate rate has no significant relation with credit risk. Among the internal variables, size, management efficiency, net profit share income, and capital adequacy are shown to influence credit risk positively and significantly, whereas the loan-to-deposit ratio has a negative but insignificant relationship with credit risk. The contribution of this article is to confirm similar conclusions regarding the influence of banking factors on credit risk.
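
A minimal sketch of the estimation step, using statsmodels OLS on synthetic quarterly data; the variable names follow the abstract, but the data and coefficients are fabricated for illustration only:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 24   # six years of quarterly observations, as in the study
df = pd.DataFrame({
    "inflation": rng.normal(7, 2, n),
    "exchange_rate": rng.normal(150, 10, n),
    "gdp_growth": rng.normal(4, 1, n),
    "bank_size": rng.normal(12, 1, n),
})
# Synthetic credit risk with signs matching those reported in the abstract.
df["credit_risk"] = (0.3 * df.inflation + 0.05 * df.exchange_rate
                     - 0.5 * df.gdp_growth + 0.2 * df.bank_size
                     + rng.normal(0, 1, n))

ols = sm.OLS(df["credit_risk"], sm.add_constant(df.drop(columns="credit_risk")))
print(ols.fit().summary())
```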

Keywords: credit risk, Islamic banks, macroeconomic variables, banks specific variable

Procedia PDF Downloads 23
24023 Effects of Vitamin C and Spondias mombin Supplementation on Hematology, Growth, Egg Production Traits, and Eggshell Quality in Japanese Quails (Coturnix coturnix japonica) in a Hot-Humid Tropics

Authors: B. O. Oyebanji, I. O. Dudusola, C. T. Ademola, S. A. Olaniyan

Abstract:

A 56-day study was conducted to evaluate the effect of dietary inclusion of Spondias mombin on hematology, growth, egg parameters, and eggshell quality of Japanese quails, Coturnix coturnix japonica. One hundred birds were used for this study, allocated randomly into 5 groups with two replicates each. Group 1 served as the control without inclusion of the extract; groups 2, 3, and 4 received 200 mg/kg, 400 mg/kg, and 800 mg/kg of S. mombin, respectively; and group 5 received 600 mg/kg of vitamin C. The birds were weighed weekly to determine weight change. The blood parameters analyzed at the completion of the experiment were PCV, Hb, RBC, WBC, and differential WBC count, from which MCH, MCHC, and MCV were afterwards calculated. Five eggs were collected from each group, and egg weight, eggshell weight, eggshell diameter, yolk weight, albumen weight, yolk diameter, yolk height, albumen percentage, yolk percentage, and shell percentage were determined. There was no significant difference among the groups for the hematological parameters measured and calculated. The egg weight and albumen weight of quails on 800 mg/kg were the highest of all the groups; all other egg parameters measured showed no significant difference. The birds supplemented with vitamin C had the highest weight gain (40.8±2.5 g) and the lowest feed conversion ratio (2.25). No mortality was recorded in any group except the SM800 group, which had 10% mortality. It can be concluded from this experiment that vitamin C supplementation has a positive effect on quail production in the humid tropics, and that the inclusion of Spondias mombin leaf extract has a dose-dependent toxicity in quails.

Keywords: hematology, quails, Spondias mombin, vitamin C

Procedia PDF Downloads 359
24022 Non-Parametric Regression over Its Parametric Counterparts with Large Sample Size

Authors: Jude Opara, Esemokumo Perewarebo Akpos

Abstract:

This paper compares non-parametric linear regression with its parametric counterpart for a large sample size. A data set of anthropometric measurements of primary school pupils, based on 50 randomly selected pupils, was used for the analysis. The data set was subjected to a normality test, and the Anderson-Darling technique showed that the residuals of the commonly used least squares regression method for fitting an equation to a set of (x, y) data points are not normally distributed (i.e., they do not follow a Gaussian distribution). The algorithms for nonparametric Theil's regression, as well as for its parametric OLS counterpart, are stated in this paper. The R statistical programming language was used for the computations. From the analysis, the results showed that there exists a significant relationship between the response and the explanatory variable for both the parametric and non-parametric regressions. To compare the efficiency of the two methods, the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) were used, and the nonparametric regression was found to perform better than its parametric counterpart owing to its lower values of both AIC and BIC. The study recommends that future researchers examine similar data for the presence of outliers, expunge them if detected, and re-analyze to compare results.
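
A minimal sketch contrasting the two estimators on synthetic heavy-tailed data; scipy's theilslopes implements Theil's median-of-pairwise-slopes estimator, and the anthropometric data of the study are not reproduced here:

```python
import numpy as np
from scipy.stats import theilslopes, linregress

rng = np.random.default_rng(6)
x = rng.uniform(100, 160, 50)                       # e.g. heights of 50 pupils (cm)
y = 0.5 * x - 30 + rng.standard_t(df=3, size=50)    # heavy-tailed, non-Gaussian noise

slope_t, intercept_t, _, _ = theilslopes(y, x)      # nonparametric Theil fit
fit = linregress(x, y)                              # parametric OLS counterpart

print(f"Theil: y = {intercept_t:.2f} + {slope_t:.3f} x")
print(f"OLS:   y = {fit.intercept:.2f} + {fit.slope:.3f} x")
```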

Keywords: Theil’s regression, Bayesian information criterion, Akaike information criterion, OLS

Procedia PDF Downloads 310