Search results for: 2d and 3d data conversion
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26322

25002 Optimizing Electric Vehicle Charging with Charging Data Analytics

Authors: Tayyibah Khanam, Mohammad Saad Alam, Sanchari Deb, Yasser Rafat

Abstract:

Electric vehicles are considered viable replacements for gasoline cars, since they reduce harmful emissions and stimulate power generation from renewable energy sources, thereby contributing to sustainability. However, one of the significant obstacles to the mass deployment of electric vehicles is charging-time anxiety among users and the consequently long waits for available chargers at charging stations. Data analytics, meanwhile, has revolutionized decision-making in management and operating systems. In this paper, we attempt to optimize the choice of EV charging station for users in their vicinity by minimizing both the time taken to reach the charging station and the waiting time for an available charger. Travel time to each charging station is obtained from the Google Maps API, and waiting times are predicted by polynomial regression on stored historical data. The proposed framework utilizes real-time and historical data from all operating charging stations in the city, assists the user in finding the charging station best suited to their current situation, and can be implemented as a mobile phone application. The algorithm successfully predicts the optimal choice of charging station and the minimum required time for various sample data sets.
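The selection rule described in the abstract can be sketched in a few lines: pick the station minimizing travel time plus predicted waiting time. Everything below is illustrative: the station data and travel times are hypothetical constants (a real system would query the Google Maps API), and a simple least-squares line stands in for the paper's polynomial regression on historical waits.

```python
# Minimal sketch of the station-choice rule: minimize travel time plus
# predicted waiting time. All data here are hypothetical; a linear fit
# stands in for the paper's polynomial regression.

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def predict_wait(history, hour):
    """Predict waiting time (minutes) at a given hour from (hour, wait) pairs."""
    a, b = fit_line([h for h, _ in history], [w for _, w in history])
    return a * hour + b

def best_station(stations, hour):
    """Return the station minimizing travel + predicted waiting time."""
    return min(stations,
               key=lambda s: s["travel_min"] + predict_wait(s["history"], hour))

stations = [  # hypothetical data: travel minutes and (hour, observed wait)
    {"name": "A", "travel_min": 5,
     "history": [(8, 30), (12, 40), (18, 50)]},
    {"name": "B", "travel_min": 12,
     "history": [(8, 10), (12, 12), (18, 14)]},
]
print(best_station(stations, 18)["name"])
```

Station B wins here even though it is farther away, because its predicted evening wait is much shorter, which is exactly the trade-off the framework is meant to capture.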

Keywords: charging data, electric vehicles, machine learning, waiting times

Procedia PDF Downloads 200
25001 Finding Data Envelopment Analysis Targets Using Multi-Objective Programming in DEA-R with Stochastic Data

Authors: R. Shamsi, F. Sharifi

Abstract:

In this paper, we obtain the projection of inefficient units in data envelopment analysis (DEA) in the case of stochastic inputs and outputs using a multi-objective programming (MOP) structure. In some problems, the inputs might be stochastic while the outputs are deterministic, and vice versa. In such cases, we propose a multi-objective DEA-R model because in some cases (e.g., when unnecessary and irrational weights in the BCC model reduce the efficiency score), an efficient decision-making unit (DMU) is classed as inefficient by the BCC model, whereas the same DMU is considered efficient by the DEA-R model. In other cases, only the ratio of stochastic data may be available (e.g., the ratio of stochastic inputs to stochastic outputs). We therefore provide a multi-objective DEA model without explicit outputs and prove that the input-oriented MOP DEA-R model under constant returns to scale can be replaced by the MOP-DEA model without explicit outputs under variable returns to scale, and vice versa. Using interactive methods to solve the proposed model yields a projection reflecting the viewpoints of the decision-maker and the analyst, which is closer to reality and more practical. Finally, an application is provided.

Keywords: DEA-R, multi-objective programming, stochastic data, data envelopment analysis

Procedia PDF Downloads 111
25000 Electrospinning in situ Synthesis of Graphene-Doped Copper Indium Disulfide Composite Nanofibers for Efficient Counter Electrode in Dye-Sensitized Solar Cells

Authors: Lidan Wang, Shuyuan Zhao, Jianxin He

Abstract:

In this paper, graphene-doped copper indium disulfide (rGO+CuInS2) composite nanofibers were fabricated via electrospinning, in situ synthesis, and carbonization, using polyvinyl pyrrolidone (PVP), copper dichloride (CuCl2), indium trichloride (InCl3), thiourea (C2H5NS), and graphene oxide nanosheets (GO) as the precursor solution for electrospinning. The average diameter of the rGO+CuInS2 nanofibers was about 100 nm, and graphene nanosheets anchored with chalcopyrite CuInS2 nanocrystals 8-15 nm in diameter were overlapped and embedded, aligned along the fiber axis. The DSSC with an rGO+CuInS2 counter electrode exhibits a power conversion efficiency of 5.93%, better than the corresponding value for a DSSC with a CuInS2 counter electrode and comparable to that of a reference DSSC with a Pt counter electrode. The excellent photoelectric performance of the rGO+CuInS2 counter electrode was attributed to its high specific surface area, which facilitated permeation of the liquid electrolyte, promoted electron and ion transfer, and provided numerous catalytically active sites for the redox reaction of the electrolyte couple (I-/I3-).

Keywords: dye-sensitized solar cells, counter electrode, electrospinning, graphene

Procedia PDF Downloads 461
24999 Integrated Model for Enhancing Data Security Processing Time in Cloud Computing

Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali

Abstract:

Cloud computing is an important and promising field of the recent decade. It allows sharing of resources, services, and information among people across the whole world. Although the advantages of using clouds are great, there are many risks, and data security is the most critical problem in cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users, and save users' time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For user authentication, a simple username-and-password scheme is used, with the password protected by a one-way SHA-2 hash. The proposed system shows an improvement in the processing time of uploading and downloading files on the cloud in secure form.
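Two pieces of the proposed model map directly onto Python's standard library and can be sketched as below: SHA-2 for one-way password storage and for file-integrity digests. The Blowfish step for encrypting the exchanged data would require a third-party crypto library and is omitted; the salt format is an illustrative assumption.

```python
# Sketch of the SHA-2 pieces of the model: one-way password hashing for
# authentication, and a file digest checked after download for integrity.
# Blowfish encryption of the payload itself is omitted (needs a crypto lib).
import hashlib

def hash_password(password: str, salt: str = "") -> str:
    """One-way SHA-256 hash stored instead of the plaintext password."""
    return hashlib.sha256((salt + password).encode()).hexdigest()

def file_digest(data: bytes) -> str:
    """SHA-256 digest kept alongside an uploaded file for integrity checks."""
    return hashlib.sha256(data).hexdigest()

stored = hash_password("s3cret", salt="u1:")        # at registration
assert stored == hash_password("s3cret", salt="u1:")  # login check

payload = b"report.pdf contents"
digest = file_digest(payload)                       # stored at upload time
assert file_digest(payload) == digest               # verified after download
```

The server never stores the plaintext password, only the digest, which is the "one-way" property the abstract relies on.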

Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish

Procedia PDF Downloads 361
24998 Comparison of Statistical Methods for Estimating Missing Precipitation Data in the River Subbasin Lenguazaque, Colombia

Authors: Miguel Cañon, Darwin Mena, Ivan Cabeza

Abstract:

In this work, we compared and evaluated the applicability of statistical methods for estimating missing precipitation data in the basin of the river Lenguazaque, located in the departments of Cundinamarca and Boyacá, Colombia. The methods used were simple linear regression, the distance-rate method, local averages, mean rates, correlation with nearby stations, and multiple regression. The effectiveness of the methods was assessed using three statistical tools: the coefficient of determination (r²), the standard error of estimation, and the Bland-Altman test of agreement. The analysis was performed by randomly removing real rainfall values at each of the stations and then estimating them with the methodologies mentioned, so as to complete the missing data. It was thus determined that, under the conditions considered, the methods with the highest performance and accuracy were multiple regression with three nearby stations and a random application scheme supported by the precipitation behavior of related data sets.
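The simplest of the compared methods can be sketched directly: estimate a missing rainfall value at a target gauge from one nearby station via simple linear regression, then score the fit with r², one of the study's three evaluation tools. The rainfall values below are hypothetical, and the study's best-performing variant (multiple regression with three stations) would extend this to several donor gauges.

```python
# Toy sketch of the simple-linear-regression infilling method with an
# r^2 check. Data are hypothetical monthly rainfall totals (mm).

def linreg(xs, ys):
    """Least-squares fit target = a * nearby + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def r_squared(xs, ys, a, b):
    """Coefficient of determination of the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

nearby = [102.0, 88.5, 120.3, 95.1, 110.7]   # donor gauge, complete record
target = [98.0, 85.0, 118.0, 93.5, 108.0]    # gauge with a gap to fill

a, b = linreg(nearby, target)
estimate = a * 104.2 + b                      # infill a missing month
print(round(estimate, 1), round(r_squared(nearby, target, a, b), 3))
```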

Keywords: statistical comparison, precipitation data, river subbasin, Bland and Altman

Procedia PDF Downloads 470
24997 Production of Antimicrobial Agents against Multidrug-Resistant Staphylococcus aureus through the Biocatalysis of Vegetable Oils

Authors: Hak-Ryul Kim, Hyung-Geun Lee, Qi Long, Ching Hou

Abstract:

Structural modification of natural lipids via chemical reaction or microbial bioconversion can change their properties or even create novel functionalities. Enzymatic oxidation of lipids leading to the formation of oxylipins is one such modification. Hydroxy fatty acids, one class of oxylipins, have attracted considerable attention because of their structural and functional properties compared with non-hydroxy fatty acids. Recently, 7,10-dihydroxy-8(E)-octadecenoic acid (DOD) was produced in high yield by microbial conversion of lipids containing oleic acid, and a further study confirmed that DOD has strong antimicrobial activity against a broad range of microorganisms. In this study, we modified DOD molecules by enzymatic or physical reactions to create new functionality or to enhance the antimicrobial activity of DOD. After modifying DOD molecules in different ways, we confirmed that the antimicrobial activity of DOD was greatly enhanced, with strong activity against multidrug-resistant Staphylococcus aureus, suggesting that DOD and its derivatives can be used as efficient antimicrobial agents for medical and industrial applications.

Keywords: biocatalysis, antimicrobial agent, multidrug-resistant bacteria, vegetable oil

Procedia PDF Downloads 209
24996 Hyperspectral Data Classification Algorithm Based on the Deep Belief and Self-Organizing Neural Network

Authors: Li Qingjian, Li Ke, He Chun, Huang Yong

Abstract:

In this paper, a method combining the Pohl Seidman deep belief network with a self-organizing neural network is proposed to classify targets. The method addresses the high nonlinearity of hyperspectral images, the high dimensionality of the samples, and the difficulty of designing a classifier. The main features of the original data are extracted by the deep belief network; during feature extraction, a small number of labeled samples is added to fine-tune the network and enrich the extracted features. The extracted feature vectors are then fed into the self-organizing neural network for classification. This method effectively reduces the spectral dimensionality of the data while preserving most of the information in the raw data, and it addresses both the shortcomings of traditional clustering and the long training times that deep learning algorithms require when labeled samples are scarce, improving classification accuracy and robustness. Simulations show that the proposed network structure achieves higher classification precision when only a small number of labeled samples is available.
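The second stage of the pipeline, the self-organizing map that clusters the feature vectors the DBN would produce, can be sketched on its own. The DBN feature extractor is omitted here, and the 2-D "features", unit count, and learning-rate schedule are toy assumptions; real hyperspectral features would be higher-dimensional.

```python
# Pure-Python sketch of a tiny self-organizing map: repeatedly pick a
# sample, find the best-matching unit (BMU), and pull it toward the
# sample with a decaying learning rate. Inputs are toy 2-D features.
import random

def train_som(data, n_units=2, iters=100, lr=0.5, seed=0):
    rng = random.Random(seed)
    units = [list(rng.choice(data)) for _ in range(n_units)]
    for t in range(iters):
        x = rng.choice(data)
        bmu = min(range(n_units),
                  key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))
        a = lr * (1 - t / iters)          # decaying learning rate
        units[bmu] = [u + a * (v - u) for u, v in zip(units[bmu], x)]
    return units

def assign(units, x):
    """Classify a feature vector by its nearest SOM unit."""
    return min(range(len(units)),
               key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))

data = [[0.1, 0.2], [0.15, 0.1], [0.9, 0.8], [0.85, 0.95]]  # two toy clusters
som = train_som(data)
print([assign(som, x) for x in data])
```

A full SOM also updates neighbors of the BMU on a grid; with only two units that neighborhood step degenerates, so it is left out of the sketch.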

Keywords: DBN, SOM, pattern classification, hyperspectral, data compression

Procedia PDF Downloads 344
24995 Assessing Performance of Data Augmentation Techniques for a Convolutional Network Trained for Recognizing Humans in Drone Images

Authors: Masood Varshosaz, Kamyar Hasanpour

Abstract:

In recent years, there has been growing interest in recognizing humans in drone images for post-disaster search and rescue operations. Deep learning algorithms have shown great promise in this area, but they often require large amounts of labeled data to train the models. To keep data acquisition costs low, augmentation techniques can be used to create additional data from existing images. Many such techniques can generate variations of an original image to improve the performance of deep learning algorithms. While data augmentation is generally assumed to improve the accuracy and robustness of the models, it is important to ensure that the performance gains are not outweighed by the additional computational cost or complexity of implementing the techniques. To this end, the impact of data augmentation on the performance of deep learning models must be evaluated. In this paper, we evaluated the most common currently available 2D data augmentation techniques on a standard convolutional network trained to recognize humans in drone images. The techniques include rotation, scaling, random cropping, flipping, shifting, and their combinations. The results showed that the augmented models perform 1-3% better than the base network. However, since the augmented images only contain the human parts already visible in the original images, a new data augmentation approach is needed to include the invisible parts of the human body. We therefore suggest a new method that employs simulated 3D human models to generate new data for training the network.
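Two of the evaluated 2D augmentations, horizontal flipping and shifting, can be shown on a toy 3x3 "image" without any image library; a real pipeline would apply the same operations to full drone frames.

```python
# Pure-Python sketch of two 2D augmentations on a toy 3x3 image
# (nested lists of pixel values).

def hflip(img):
    """Mirror each row left-to-right."""
    return [row[::-1] for row in img]

def shift_right(img, k, fill=0):
    """Shift pixels k >= 1 columns right, padding with a fill value."""
    return [[fill] * k + row[:-k] for row in img]

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
augmented = [hflip(img), shift_right(img, 1)]
print(augmented[0][0], augmented[1][0])
```

Each augmented copy keeps the original label ("human" / "no human"), which is what lets these cheap transforms multiply the training set.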

Keywords: human recognition, deep learning, drones, disaster mitigation

Procedia PDF Downloads 101
24994 Efficacy of Opicapone and Levodopa with Different Levodopa Daily Doses in Parkinson’s Disease Patients with Early Motor Fluctuations: Findings from the Korean ADOPTION Study

Authors: Jee-Young Lee, Joaquim J. Ferreira, Hyeo-il Ma, José-Francisco Rocha, Beomseok Jeon

Abstract:

The effective management of wearing-off is a key driver of medication changes for patients with Parkinson's disease (PD) treated with levodopa (L-DOPA). While L-DOPA is well tolerated and efficacious, its clinical utility over time is often limited by the development of complications such as dyskinesia. A common first-line option is to adjust the daily L-DOPA dose, followed by adjunctive therapies, usually accounted for via the L-DOPA equivalent daily dose (LEDD). LEDD conversion formulae are a tool for comparing the equivalence of anti-PD medications. The aim of this work is to compare the effects of opicapone (OPC) 50 mg, a catechol-O-methyltransferase (COMT) inhibitor, and an additional 100 mg dose of L-DOPA in reducing off time in PD patients with early motor fluctuations receiving different daily L-DOPA doses. OPC has been found to be well tolerated and efficacious in the advanced PD population. This work utilized patients' home diary data from a 4-week Phase 2 pharmacokinetics clinical study. The Korean ADOPTION study randomized (1:1) patients with PD and early motor fluctuations treated with up to 600 mg of L-DOPA given 3–4 times daily. The main endpoint was the change from baseline in off time in the subgroup of patients receiving 300–400 mg/day L-DOPA at baseline plus OPC 50 mg and in the subgroup receiving >300 mg/day L-DOPA at baseline plus an additional dose of L-DOPA 100 mg. Of the 86 patients included in this subgroup analysis, 39 received OPC 50 mg and 47 received L-DOPA 100 mg. At baseline, both the total daily L-DOPA dose and the LEDD were lower in the L-DOPA 300–400 mg/day plus OPC 50 mg group than in the L-DOPA >300 mg/day plus L-DOPA 100 mg group. However, at Week 4, LEDD was similar between the two groups.
The mean (±standard error) reduction in off time was approximately three-fold greater for the OPC 50 mg group than for the L-DOPA 100 mg group: -63.0 (14.6) minutes for patients treated with L-DOPA 300–400 mg/day plus OPC 50 mg, versus -22.1 (9.3) minutes for those receiving L-DOPA >300 mg/day plus L-DOPA 100 mg. In conclusion, despite similar LEDD, OPC demonstrated a significantly greater reduction in off time than an additional 100 mg L-DOPA dose. The effect of OPC appears to be LEDD-independent, suggesting that caution should be exercised when employing LEDD to guide treatment decisions, as it does not take into account the timing of each dose, the onset and duration of the therapeutic effect, or individual responsiveness. Additionally, OPC could be used to keep the L-DOPA dose as low as possible for as long as possible, to avoid the development of motor complications, which are a significant source of disability.
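The LEDD bookkeeping behind the comparison can be made concrete. The conversion factor below is an illustrative placeholder, not the study's value or a published one; real analyses should use published LEDD conversion tables. The point of the sketch is only that two arms can reach a similar LEDD by different routes, which is the premise of the comparison above.

```python
# Toy LEDD bookkeeping. The COMT-inhibitor factor of 0.25 is a
# placeholder assumption, not a published conversion value.

def ledd(ld_mg_per_day, comt_factor=0.0):
    """L-DOPA dose plus the extra equivalence a COMT inhibitor adds."""
    return ld_mg_per_day * (1 + comt_factor)

# Two arms reaching a similar LEDD by different routes:
arm_opc = ledd(400, comt_factor=0.25)  # 400 mg/day L-DOPA + OPC (placeholder)
arm_ld  = ledd(500)                    # 400 mg/day + an extra 100 mg L-DOPA
print(arm_opc, arm_ld)
```

Identical LEDD, yet (per the study) different off-time reductions: exactly why the abstract cautions against treating LEDD as a complete guide.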

Keywords: opicapone, levodopa, pharmacokinetics, off-time

Procedia PDF Downloads 67
24993 Combined Influence of Charge Carrier Density and Temperature on Open-Circuit Voltage in Bulk Heterojunction Organic Solar Cells

Authors: Douglas Yeboah, Monishka Narayan, Jai Singh

Abstract:

One of the key parameters determining the power conversion efficiency (PCE) of organic solar cells (OSCs) is the open-circuit voltage; however, it is still not well understood. In order to examine the performance of OSCs, it is necessary to understand the losses associated with the open-circuit voltage and how best it can be improved. Here, an analytical expression for the open-circuit voltage of bulk heterojunction (BHJ) OSCs is derived from the charge carrier densities without considering the drift-diffusion current. The open-circuit voltage thus obtained depends on the donor-acceptor band gap, the energy difference between the highest occupied molecular orbital (HOMO) and the hole quasi-Fermi level of the donor material, the temperature, the carrier (electron) density, the generation rate of free charge carriers, and the bimolecular recombination coefficient. It is found that the open-circuit voltage increases as the carrier density increases and as the temperature decreases. The calculated results are discussed in view of experimental results and agree with them reasonably well. Overall, this work proposes an alternative pathway for improving the open-circuit voltage in BHJ OSCs.
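The reported trends can be anticipated from a commonly used textbook form for BHJ devices (a sketch under standard assumptions, not the authors' own derived expression): at open circuit, generation balances bimolecular recombination, so the steady-state carrier density follows from the generation rate and the recombination coefficient.

```latex
% Hedged sketch: a standard analytic form for the BHJ open-circuit
% voltage, not the authors' derivation. E_{DA} is the effective
% donor-acceptor gap, N_c the effective density of states, G the
% generation rate, and \beta the bimolecular recombination coefficient.
G = \beta n^2 \;\Rightarrow\; n = \sqrt{G/\beta},
\qquad
V_{oc} = \frac{E_{DA}}{q} - \frac{2kT}{q}\ln\!\left(\frac{N_c}{n}\right)
       = \frac{E_{DA}}{q} - \frac{kT}{q}\ln\!\left(\frac{N_c^{2}\,\beta}{G}\right)
```

Both forms reproduce the trends stated above: the logarithmic loss term shrinks as the carrier density n grows and grows linearly with temperature T, so V_oc rises with carrier density and falls with temperature.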

Keywords: charge carrier density, open-circuit voltage, organic solar cells, temperature

Procedia PDF Downloads 379
24992 Valorization of Gypsum as Industrial Waste

Authors: Hasna Soli

Abstract:

The main objective of this work is the extraction of sulfur from gypsum, here an industrial waste. Sulfuric acid production passes through the following process: melting sulfur, filtering the liquid sulfur, burning the sulfur to produce SO₂, converting SO₂ to SO₃, and absorbing SO₃ in water to produce H₂SO₄, leaving anhydrous calcium sulfate (CaSO₄) as waste. Further objectives of this work are to improve industrial practice, to find other ways to manage this solid waste, and to assess the consequences of treatment and the by-products formed. First, this type of waste is characterized by X-ray diffraction, to obtain the composition of the solid phases, and by chemical analysis, performed gravimetrically and by atomic absorption spectrometry or ICP. The samples are mineralized in suitable acidic or basic solutions. The elements analyzed are CaO, SO₃, Al₂O₃, Fe₂O₃, MgO, and SiO₂. This is followed by energy-dispersive spectrometry (EDS) using an Oxford EDX probe, and by differential thermal and gravimetric analyses. The gypsum is then valorized: the CaSO₄ is reused to produce sulfuric acid, which is reintroduced into the production line. The second approach explored in this work is the thermal treatment of the solid waste to remove sulfur as a dilute sulfuric acid solution.

Keywords: environment, gypsum, sulfur, waste

Procedia PDF Downloads 303
24991 Insight Into Database Forensics

Authors: Enas K., Fatimah A., Abeer A., Ghadah A.

Abstract:

Database forensics is a specialized field of digital forensics that investigates and analyzes database systems to recover and evaluate data, particularly in cases of cyberattacks and data breaches. The increasing significance of securing data confidentiality, integrity, and availability has emphasized the need for robust forensic models to preserve data integrity and maintain the chain of evidence. Organizations rely on Database Forensic Investigation (DBFI) to protect critical data, maintain trust, and support legal actions in the event of breaches. To address the complexities of relational and non-relational databases, structured forensic frameworks and tools have been developed. These include the Three-Tier Database Forensic Model (TT-DF) for comprehensive investigations, blockchain-backed logging systems for enhanced evidence reliability, and the FORC tool for mobile SQLite database forensics. Such advancements facilitate data recovery, identify unauthorized access, and reconstruct events for legal proceedings. Practical demonstrations of these tools and frameworks further illustrate their real-world applicability, advancing the effectiveness of database forensics in mitigating modern cybersecurity threats.

Keywords: database forensics, cybersecurity, SQLite forensics, digital forensics

Procedia PDF Downloads 10
24990 Emotional Artificial Intelligence and the Right to Privacy

Authors: Emine Akar

Abstract:

The majority of privacy-related regulation has traditionally focused on concepts that are perceived to be well-understood or easily describable, such as certain categories of data and personal information or images. In the past century, such regulation appeared reasonably suitable for its purposes. However, technologies such as AI, combined with ever-increasing capabilities to collect, process, and store "big data", not only require calibration of these traditional understandings but may require re-thinking of entire categories of privacy law. In the presentation, it will be explained, against the background of various emerging technologies under the umbrella term "emotional artificial intelligence", why modern privacy law will need to embrace human emotions as potentially private subject matter. This argument can be made on a jurisprudential level, given that human emotions can plausibly be accommodated within the various concepts that are traditionally regarded as the underlying foundation of privacy protection, such as, for example, dignity, autonomy, and liberal values. However, the practical reasons for regarding human emotions as potentially private subject matter are perhaps more important (and very likely more convincing from the perspective of regulators). In that respect, it should be regarded as alarming that, according to most projections, the usefulness of emotional data to governments and, particularly, private companies will not only lead to radically increased processing and analysing of such data but, concerningly, to an exponential growth in the collection of such data. In light of this, it is also necessary to discuss options for how regulators could address this emerging threat.

Keywords: AI, privacy law, data protection, big data

Procedia PDF Downloads 91
24989 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform

Authors: Reza Mohammadzadeh

Abstract:

The major challenges of geotechnical engineering in underground spaces arise from uncertainties and differing probabilities. The collection, collation, and collaborative use of existing data, incorporated into analysis and design for a given prospect evaluation, offer a reliable and practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence, rooted in statistics, that applies various techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data to automatically learn and improve from it, without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema for geotechnical risks in underground coal mining, based on a cloud system architecture, has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) is proposed in this model. The stages of the model's workflow methodology are then described. To train the data and deploy the LoK models, an ML platform has been implemented: IBM Watson Studio, a leading data science tool and data-driven cloud-integration ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
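The three-dimensional risk-matrix idea can be sketched as a conventional likelihood-by-consequence score escalated when the level of knowledge is low, so poorly understood hazards are treated more cautiously. The scales, thresholds, and LoK adjustment below are illustrative assumptions, not the paper's calibrated matrix.

```python
# Toy 3D risk matrix: likelihood x consequence, escalated by a low
# level of knowledge (LoK). All thresholds are illustrative assumptions.

def risk_rating(likelihood, consequence, lok):
    """likelihood, consequence in 1..5; lok in 1 (low) .. 3 (high)."""
    score = likelihood * consequence
    if lok == 1:                    # little knowledge: bump up a band
        score = min(25, score + 5)
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Same hazard, different knowledge levels:
print(risk_rating(3, 4, lok=3), risk_rating(3, 4, lok=1))
```

The same hazard moves from "medium" to "high" purely because less is known about it, which is the cautious behavior the LoK dimension is meant to encode.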

Keywords: data model, geotechnical risks, machine learning, underground coal mining

Procedia PDF Downloads 280
24988 Classification of Poverty Level Data in Indonesia Using the Naïve Bayes Method

Authors: Anung Style Bukhori, Ani Dijah Rahajoe

Abstract:

Poverty poses a significant challenge in Indonesia, requiring an effective analytical approach to understand and address the issue. In this research, we applied the Naïve Bayes classification method to examine and classify poverty data in Indonesia. The main focus is on classifying data using RapidMiner, a powerful data analysis platform. The analysis process involves splitting the data to train and test the classification model. First, we collected and prepared a poverty dataset that includes factors such as education, employment, and health. The experimental results indicate that the Naïve Bayes classification model can provide predictions regarding the risk of poverty, and the use of RapidMiner in the analysis process offers flexibility and efficiency in evaluating the model's performance. The classification produces several values that serve as a baseline for classifying poverty data in Indonesia using Naïve Bayes. The accuracy obtained is 40.26%, with recall of 35.94% for the moderate class, 63.16% for the high class, and 38.03% for the low class. The precision is 58.97% for the moderate class, 17.39% for the high class, and 58.70% for the low class. These results can be seen in the graph below.
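Per-class precision and recall figures like those reported follow mechanically from a confusion matrix, which can be sketched as below. The counts are hypothetical, not the paper's data; only the computation is standard.

```python
# How per-class precision/recall are derived from a confusion matrix.
# Rows are true classes, columns predicted (order: low, moderate, high).
# The counts are hypothetical, for illustration only.

def per_class_metrics(cm, labels):
    out = {}
    for i, lab in enumerate(labels):
        tp = cm[i][i]
        fn = sum(cm[i]) - tp                              # missed this class
        fp = sum(cm[r][i] for r in range(len(cm))) - tp   # wrongly given it
        out[lab] = {"recall": tp / (tp + fn),
                    "precision": tp / (tp + fp)}
    return out

cm = [[27, 30, 14],   # true low
      [12, 23, 29],   # true moderate
      [ 7,  5, 12]]   # true high
m = per_class_metrics(cm, ["low", "moderate", "high"])
accuracy = sum(cm[i][i] for i in range(3)) / sum(map(sum, cm))
print(round(accuracy, 4))
```

Note how accuracy can be low while one class still has high recall, the same asymmetric pattern the abstract reports across its three poverty classes.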

Keywords: poverty, classification, naïve bayes, Indonesia

Procedia PDF Downloads 65
24987 Web Search Engine Based Naming Procedure for Independent Topic

Authors: Takahiro Nishigaki, Takashi Onoda

Abstract:

In recent years, the amount of document data has been increasing with the spread of the Internet, and many methods have been studied for extracting topics from large document collections. We previously proposed Independent Topic Analysis (ITA) to extract mutually independent topics from large document data, such as newspaper articles. ITA extracts independent topics from the document data using Independent Component Analysis. Each topic extracted by ITA is represented as a set of words. However, such a set of words can be quite different from the topic the user imagines. For example, the top five words with high independence for one topic are: Topic1 = {"scor", "game", "lead", "quarter", "rebound"}. Topic 1 is considered to represent the topic "SPORTS", but this topic name has to be attached by the user; ITA cannot name topics. Therefore, in this research, we propose a method that uses a web search engine to obtain names, easy for people to understand, for the word sets produced by independent topic analysis. In particular, we search for a set of topical words and take the title of the top homepage in the search results as the topic name. We also apply the proposed method to real data and verify its effectiveness.

Keywords: independent topic analysis, topic extraction, topic naming, web search engine

Procedia PDF Downloads 124
24986 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas

Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards

Abstract:

Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, and spatial analysis. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data comprise ground points (representing the bare earth) and non-ground points (representing buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data, and the presented filter utilizes a weight function to allocate a weight to each point of the data. Furthermore, unlike most methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples, with an average root mean square error (RMSE) of 0.35 m.
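The weighting idea can be shown in one dimension: iteratively fit a smooth surface and down-weight points far above it, since those are likely canopy returns, while points near or below the fit keep full weight as candidate ground returns. The paper fits smoothing splines to full 3D clouds; the straight-line fit, toy profile, and thresholds below are simplifying assumptions that keep the sketch short.

```python
# 1-D toy of iterative weighted ground filtering: points well ABOVE the
# current fit (canopy) get near-zero weight; the refit then hugs the
# ground, and the final residuals classify each return.

def weighted_linefit(xs, ys, ws):
    """Weighted least-squares line y = a*x + b."""
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw
    my = sum(w * y for w, y in zip(ws, ys)) / sw
    a = sum(w * (x - mx) * (y - my) for w, x, y in zip(ws, xs, ys)) / \
        sum(w * (x - mx) ** 2 for w, x in zip(ws, xs))
    return a, my - a * mx

def classify_ground(xs, zs, iters=3, cut=1.0):
    """True = ground return, False = (likely) canopy return."""
    ws = [1.0] * len(xs)
    for _ in range(iters):
        a, b = weighted_linefit(xs, zs, ws)
        resid = [z - (a * x + b) for x, z in zip(xs, zs)]
        ws = [0.01 if r > cut else 1.0 for r in resid]  # canopy ~zero weight
    return [r <= cut for r in resid]

xs = [0, 1, 2, 3, 4, 5]
zs = [10.0, 10.1, 18.0, 10.2, 17.5, 10.3]   # 18.0 and 17.5 are canopy hits
print(classify_ground(xs, zs))
```

Only positive residuals are penalized: a return below the surface is never vegetation, which is why the weight function is one-sided.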

Keywords: airborne laser scanning, digital terrain models, filtering, forested areas

Procedia PDF Downloads 142
24985 Estimating the Life-Distribution Parameters of Weibull-Life PV Systems Utilizing Non-Parametric Analysis

Authors: Saleem Z. Ramadan

Abstract:

In this paper, a model is proposed to determine the life distribution parameters of the useful life region for the PV system utilizing a combination of non-parametric and linear regression analysis for the failure data of these systems. Results showed that this method is dependable for analyzing failure time data for such reliable systems when the data is scarce.
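The linear-regression half of such an approach can be sketched with median-rank regression: plot ln(-ln(1-F)) against ln(t) and read the Weibull shape off the slope. Bernard's approximation for the ranks and the failure times below are illustrative assumptions, not the paper's data or its exact procedure.

```python
# Sketch of Weibull parameter estimation by median-rank regression:
# regress ln(-ln(1 - F_i)) on ln(t_i); slope = shape beta, and the
# intercept gives scale eta. Failure times are hypothetical.
import math

def weibull_fit(times):
    times = sorted(times)
    n = len(times)
    xs = [math.log(t) for t in times]
    ys = [math.log(-math.log(1 - (i - 0.3) / (n + 0.4)))
          for i in range(1, n + 1)]          # Bernard median ranks
    mx, my = sum(xs) / n, sum(ys) / n
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)
    eta = math.exp(mx - my / beta)           # from intercept = -beta*ln(eta)
    return beta, eta

beta, eta = weibull_fit([120, 190, 250, 300, 365, 410])  # hypothetical lifetimes
print(round(beta, 2), round(eta, 1))
```

A shape parameter near 1 would indicate the constant-hazard useful-life region the abstract targets; larger values indicate wear-out.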

Keywords: masking, bathtub model, reliability, non-parametric analysis, useful life

Procedia PDF Downloads 564
24984 Preliminary Design of Maritime Energy Management System: Naval Architectural Approach to Resolve Recent Limitations

Authors: Seyong Jeong, Jinmo Park, Jinhyoun Park, Boram Kim, Kyoungsoo Ahn

Abstract:

Energy management in the maritime industry is increasingly required both by economics and by new legislative actions taken by the International Maritime Organization (IMO) and the European Union (EU). In response, various performance monitoring methodologies and data collection practices have been examined by different stakeholders. While many assorted advancements in operation and technology are applicable, their adoption in the shipping industry remains limited. This slow uptake can be attributed to barriers such as data analysis problems, misreported data, and feedback problems. This study presents a conceptual design of an energy management system (EMS) and proposes a methodology to resolve the current limitations (e.g., data normalization using naval architectural evaluation, management of misrepresented data, and feedback from shore to ship through management of the performance analysis history). We expect this system to enable even short-term charterers to assess ship performance properly and implement sustainable fleet control.

Keywords: data normalization, energy management system, naval architectural evaluation, ship performance analysis

Procedia PDF Downloads 452
24983 Deproteination and Demineralization of Shrimp Waste Using Lactic Acid Bacteria for the Production of Crude Chitin and Chitosan

Authors: Farramae Francisco, Rhoda Mae Simora, Sharon Nunal

Abstract:

The deproteination and demineralization efficiencies of shrimp waste treated with two Lactobacillus species and different carbohydrate sources were determined for chitin production, along with the chemical conversion of the chitin to chitosan and the quality of the chitin and chitosan produced. Using 5% glucose and 5% cassava starch as carbohydrate sources, the pH rose slightly from the initial 6.0 to 6.8 and 7.2, respectively, after 24 h and remained at 6.7-7.3 throughout the treatment period. Demineralization (%) with both 5% glucose and 5% cassava starch was highest during the first day of treatment, at 82% and 83%, respectively. Deproteination (%) was highest with 5% cassava starch on the 3rd day of treatment, at 84.4%. The chitin obtained from 5% cassava starch and 5% glucose had residual ash and protein below 1% and solubilities of 59% and 44.3%, respectively. Chitosan produced from 5% cassava starch and 5% glucose had protein content below 0.05%, with residual ash of 1.1% and 0.8%, respectively. Chitosan solubility and degree of deacetylation were 56% and 33% with 5% glucose and 48% and 29% with 5% cassava starch, respectively. The advantages this alternative technology offers over chemical extraction are a large reduction in the chemicals needed, and thus less effluent, and the generation of a protein-rich liquor, although the demineralization process should be improved to achieve a greater degree of deacetylation.

Keywords: alternative carbon source, bioprocessing, lactic acid bacteria, waste utilization

Procedia PDF Downloads 491
24982 Geospatial Data Complexity in Electronic Airport Layout Plan

Authors: Shyam Parhi

Abstract:

The Airports GIS program collects airport data, validates and verifies it, and stores it in a specific database. Airports GIS allows authorized users to submit changes to airport data. The verified data are used to develop several engineering applications. One of these applications is the electronic Airport Layout Plan (eALP), whose primary aim is to move from the paper to the digital form of the ALP. The first phase of development of the eALP was completed recently and was tested for a few pilot-program airports across different regions. We conducted a gap analysis and noticed that a lot of development work is needed to fine-tune at least six mandatory sheets of the eALP. It is important to note that a significant amount of programming is needed to move from out-of-the-box ArcGIS to a much-customized ArcGIS, which will be discussed. The ArcGIS viewer's capability to display essential features, such as a runway or taxiway and the perpendicular distance between them, will be discussed. An enterprise-level workflow that incorporates the coordination process among different lines of business will be highlighted.

Keywords: geospatial data, geology, geographic information systems, aviation

Procedia PDF Downloads 421
24981 Investigate and Control Thermal Spectra in Nanostructures and 2D Van der Waals Materials

Authors: Joon Sang Kang, Ming Ke, Yongjie Hu

Abstract:

Controlling heat transfer and the thermal properties of materials is important to many fields, such as energy efficiency and the thermal management of integrated circuits. Significant progress has been made over the past decade to improve material performance through structuring at the nanoscale; however, a clear relationship between structure dimensions, interfaces, and thermal properties remains to be established. The main challenge comes from the unknown intrinsic spectral contributions of different phonons. Here, we describe our current progress on quantifying and controlling thermal spectra based on our recently developed technical approach using ultrafast optical spectroscopy. Our work furthers the promise of rational material design to achieve high performance through a synergistic experimental-modeling approach. This approach can be broadly applied to a wide range of materials and energy systems. In particular, we demonstrate in-situ characterization and tunable thermal properties of 2D van der Waals materials through ionic intercalation. The significant impact of this research in improving the efficiency of thermal energy conversion and management will also be illustrated.

Keywords: energy, mean free path, nanoscale heat transfer, nanostructure, phonons, TDTR, thermoelectrics, 2D materials

Procedia PDF Downloads 293
24980 Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising

Authors: Jianwei Ma, Diriba Gemechu

Abstract:

In seismic data processing, the attenuation of random noise is the basic step to improve the quality of data for the further application of seismic data in exploration and development in the gas and oil industries. The signal-to-noise ratio also largely determines the quality of seismic data; it affects the reliability and accuracy of the seismic signal during interpretation for different purposes in different companies. To use seismic data for further application and interpretation, we need to improve the signal-to-noise ratio while attenuating random noise effectively. To improve the signal-to-noise ratio and attenuate seismic random noise while preserving the important features and information of the seismic signal, we introduce an anisotropic total fractional-order denoising algorithm. The anisotropic total fractional-order variation model, defined in the space of fractional-order bounded variation, is proposed as a regularization in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional-order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised result is compared with F-X deconvolution and the non-local means denoising algorithm.
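For orientation, the split Bregman iteration can be sketched for the simpler integer-order (first-order) anisotropic TV special case; the fractional-order derivatives of the proposed model are not reproduced here, and the function name, parameters, periodic boundaries, and FFT-based u-step are illustrative choices, not the authors' implementation.

```python
import numpy as np

def shrink(x, t):
    # soft-thresholding: the proximal operator of the l1 norm
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def tv_denoise_split_bregman(f, mu=10.0, lam=5.0, n_iter=100):
    """Anisotropic TV denoising, min_u |D_x u|_1 + |D_y u|_1 + (mu/2)||u - f||^2,
    via split Bregman with periodic boundaries and an FFT-based u-step."""
    f = f.astype(float)
    u = f.copy()
    dx = np.zeros_like(f); dy = np.zeros_like(f)
    bx = np.zeros_like(f); by = np.zeros_like(f)
    # Fourier symbol of (mu*I - lam*Laplacian) on the periodic grid
    ny, nx = f.shape
    wx = 2.0 * np.cos(2.0 * np.pi * np.arange(nx) / nx) - 2.0
    wy = 2.0 * np.cos(2.0 * np.pi * np.arange(ny) / ny) - 2.0
    denom = mu - lam * (wy[:, None] + wx[None, :])
    for _ in range(n_iter):
        # u-step: (mu*I - lam*Lap) u = mu*f + lam*D^T(d - b), solved by FFT
        vx, vy = dx - bx, dy - by
        rhs = mu * f + lam * (np.roll(vx, 1, axis=1) - vx
                              + np.roll(vy, 1, axis=0) - vy)
        u = np.real(np.fft.ifft2(np.fft.fft2(rhs) / denom))
        # d-step: shrinkage on the Bregman-shifted forward differences
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        dx = shrink(ux + bx, 1.0 / lam)
        dy = shrink(uy + by, 1.0 / lam)
        # Bregman variable update
        bx += ux - dx
        by += uy - dy
    return u
```

In this sketch mu trades data fidelity against smoothing; the fractional-order model generalizes the difference operators D_x and D_y to fractional derivatives while the same splitting structure carries over.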

Keywords: anisotropic total fractional order variation, fractional order bounded variation, seismic random noise attenuation, split Bregman algorithm

Procedia PDF Downloads 211
24979 NSBS: Design of a Network Storage Backup System

Authors: Xinyan Zhang, Zhipeng Tan, Shan Fan

Abstract:

The first layer of defense against data loss is backup data. This paper implements an agent-based network backup system built on a tripartite construction of backup agent, storage server, and backup server, and realizes snapshots and a hierarchical index in the NSBS. The system separates control commands from the data flow and balances the system load, thereby improving the efficiency of system backup and recovery. The test results show that the agent-based network backup system can effectively improve task-based concurrency, reasonably allocate network bandwidth, incur a smaller backup performance loss, and improve data recovery efficiency by 20%.

Keywords: agent, network backup system, three architecture model, NSBS

Procedia PDF Downloads 463
24978 A t-SNE and UMAP Based Neural Network Image Classification Algorithm

Authors: Shelby Simpson, William Stanley, Namir Naba, Xiaodi Wang

Abstract:

Both t-SNE and UMAP are state-of-the-art tools that predominantly preserve local structure, that is, they group neighboring data points together, which provides a very informative visualization of the heterogeneity in our data. In this research, we develop a t-SNE and UMAP based neural network image classification algorithm that embeds the original dataset into a corresponding low-dimensional dataset as a preprocessing step and then uses this embedded dataset as input to a specially designed neural network classifier for image classification. In our experiments, we use the Fashion-MNIST data set, a labeled data set of images of clothing objects. t-SNE and UMAP are used for dimensionality reduction of the data set and thus produce low-dimensional embeddings. Furthermore, we feed the embeddings from t-SNE and UMAP into two neural networks. The accuracy of the models from the two neural networks is then compared to that of a dense neural network that does not use an embedding as input, to show which model classifies the images of clothing objects more accurately.
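The embed-then-classify pipeline described above can be sketched with scikit-learn; as stand-ins, this uses the small built-in digits data set instead of Fashion-MNIST and t-SNE only (UMAP requires the third-party umap-learn package), and because t-SNE provides no transform() for unseen points, the whole set is embedded before splitting.

```python
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# small built-in digits set stands in for Fashion-MNIST
X, y = load_digits(return_X_y=True)
X, y = X[:600], y[:600]

# 2-D t-SNE embedding as the preprocessing step
emb = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

# split the embedded and raw versions identically
Xe_tr, Xe_te, Xr_tr, Xr_te, y_tr, y_te = train_test_split(
    emb, X, y, test_size=0.3, random_state=0)

# same classifier architecture on the embedding and on the raw pixels
clf_emb = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                        random_state=0).fit(Xe_tr, y_tr)
clf_raw = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                        random_state=0).fit(Xr_tr, y_tr)

acc_emb = accuracy_score(y_te, clf_emb.predict(Xe_te))
acc_raw = accuracy_score(y_te, clf_raw.predict(Xr_te))
```

Comparing acc_emb against acc_raw mirrors the abstract's comparison between the embedding-fed networks and the dense network trained on raw images.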

Keywords: t-SNE, UMAP, fashion MNIST, neural networks

Procedia PDF Downloads 204
24977 An Online Adaptive Thresholding Method to Classify Google Trends Data Anomalies for Investor Sentiment Analysis

Authors: Duygu Dere, Mert Ergeneci, Kaan Gokcesu

Abstract:

Google Trends data has gained increasing popularity in applications of behavioral finance, decision science, and risk management. Because of Google's wide range of use, Trends statistics provide significant information about investor sentiment and intention, which can be used as decisive factors in corporate and risk management. However, an anomaly, i.e., a significant increase or decrease in a certain query, cannot be detected by state-of-the-art computational methods because of the random baseline noise of the Trends data, which is modeled as additive white Gaussian noise (AWGN). Since the baseline noise power changes gradually over time, an adaptive thresholding method is required to track and learn the baseline noise for correct classification. To this end, we introduce an online method to classify meaningful deviations in Google Trends data. Through extensive experiments, we demonstrate that our method can successfully classify various anomalies across a wide variety of data.
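An adaptive baseline tracker of the kind described can be sketched as follows; the class, its exponential-moving-average updates, and the k-sigma decision rule are hypothetical stand-ins for the paper's soft-minimum thresholding method.

```python
import numpy as np

class OnlineAnomalyClassifier:
    """Online anomaly flagging for a noisy series: the baseline level and the
    AWGN-like noise power are tracked with exponential moving averages, and a
    point is flagged when it deviates from the baseline by more than k sigma.
    A generic stand-in, not the paper's soft-minimum thresholding scheme."""

    def __init__(self, alpha=0.05, k=4.0):
        self.alpha = alpha      # EWMA forgetting factor
        self.k = k              # threshold in noise standard deviations
        self.mean = None        # running baseline estimate
        self.var = None         # running noise-power estimate

    def update(self, x):
        if self.mean is None:   # initialise from the first sample
            self.mean, self.var = x, 1.0
            return False
        sigma = self.var ** 0.5
        is_anomaly = abs(x - self.mean) > self.k * sigma
        if not is_anomaly:      # adapt only on baseline (non-anomalous) samples
            d = x - self.mean
            self.mean += self.alpha * d
            self.var = (1.0 - self.alpha) * self.var + self.alpha * d * d
        return is_anomaly
```

Updating the estimates only on non-flagged samples is what lets the threshold follow the gradually changing noise power without being dragged by the anomalies themselves.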

Keywords: adaptive data processing, behavioral finance, convex optimization, online learning, soft minimum thresholding

Procedia PDF Downloads 174
24976 Estimating the Power Influence of an Off-Grid Photovoltaic Panel on the Charging Rate of a Storage System (Batteries)

Authors: Osamede Asowata

Abstract:

The current resurgence of interest in the use of renewable energy is driven by the need to reduce the high environmental impact of fossil-based energy. The aim of this paper is to evaluate the effect of a stationary PV panel on the charging rate of deep-cycle valve-regulated lead-acid (DCVRLA) batteries. Stationary PV panels are set to a fixed tilt and orientation angle, which plays a major role in dictating the output power of a PV panel and, subsequently, the charging time of a DCVRLA battery. In a basic PV system, an energy storage device that stores the power from the PV panel is necessary because of the fluctuating nature of the PV voltage caused by climatic conditions. The charging and discharging times of a DCVRLA battery were determined over a twelve-month period from January through December 2012. Preliminary results, which include regression analysis (R²), conversion time per week, and work time per day, indicate that a 36° tilt angle produces a good charging rate at a latitude of 26° south throughout the year.
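The goodness-of-fit measure (R²) used in the regression analysis above can be illustrated with a minimal sketch; the hourly voltage readings below are hypothetical, not the study's measurements.

```python
import numpy as np

# hypothetical hourly battery-voltage readings during one day's charge
hours = np.arange(0.0, 10.0)
voltage = np.array([11.8, 12.0, 12.3, 12.5, 12.7,
                    12.9, 13.1, 13.3, 13.6, 13.8])

# second-order polynomial fit of the charging curve
coeffs = np.polyfit(hours, voltage, deg=2)
pred = np.polyval(coeffs, hours)

# coefficient of determination: R^2 = 1 - SS_res / SS_tot
ss_res = np.sum((voltage - pred) ** 2)
ss_tot = np.sum((voltage - voltage.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

An R² close to 1 indicates the fitted charging curve explains nearly all the variance in the measured voltages, which is how the tilt-angle comparison in the study can be quantified.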

Keywords: tilt and orientation angles, solar chargers, PV panels, storage devices, direct solar radiation

Procedia PDF Downloads 242
24975 Energy Efficiency Assessment of Energy Internet Based on Data-Driven Fuzzy Integrated Cloud Evaluation Algorithm

Authors: Chuanbo Xu, Xinying Li, Gejirifu De, Yunna Wu

Abstract:

The Energy Internet (EI) is a new form of energy system that deeply integrates the Internet with the entire energy process from production to consumption. The assessment of energy efficiency performance is of vital importance for the long-term sustainable development of an EI project. Although the newly proposed fuzzy integrated cloud evaluation algorithm considers the randomness of uncertainty, it relies too much on the experience and knowledge of experts. Fortunately, the enrichment of EI data has enabled the utilization of data-driven methods. Therefore, the main purpose of this work is to assess the energy efficiency of a park-level EI by combining a data-driven method with the fuzzy integrated cloud evaluation algorithm. Firstly, the indicators for energy efficiency are identified through a literature review. Secondly, an artificial neural network (ANN)-based data-driven method is employed to cluster the values of the indicators. Thirdly, the energy efficiency of the EI project is calculated through the fuzzy integrated cloud evaluation algorithm. Finally, the applicability of the proposed method is demonstrated by a case study.
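The cloud-model side of the algorithm can be illustrated with a generic forward normal-cloud generator, a standard building block of cloud-model evaluation; the function and its parameters are illustrative and do not reproduce the paper's full fuzzy integrated algorithm.

```python
import numpy as np

def normal_cloud_drops(ex, en, he, n, seed=None):
    """Forward normal-cloud generator: Ex = expectation, En = entropy,
    He = hyper-entropy. Returns n cloud drops x_i and their certainty
    degrees mu_i. A generic sketch of the cloud model only."""
    rng = np.random.default_rng(seed)
    en_i = rng.normal(en, he, n)            # per-drop entropy, spread by He
    en_i = np.abs(en_i) + 1e-12             # keep the std. deviation positive
    x = rng.normal(ex, en_i)                # drop positions on the indicator scale
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_i ** 2))  # certainty degrees
    return x, mu
```

In an evaluation setting, Ex would encode an indicator's expected score, En its fuzziness, and He the randomness of that fuzziness; the drops then characterize how confidently a clustered indicator value belongs to a grade.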

Keywords: energy efficiency, energy internet, data-driven, fuzzy integrated evaluation, cloud model

Procedia PDF Downloads 207
24974 Identification and Quantification of Acid Sites of M(x)X Zeolites (M = Cu2+ and/or Zn2+, x = Level of Exchange): An In Situ FTIR Study Using Pyridine Adsorption/Desorption

Authors: H. Hammoudi, S. Bendenia, I. Batonneau-Gener, J. Comparot, K. Marouf-Khelifa, A. Khelifa

Abstract:

X zeolites were prepared by ion exchange with Cu2+ and/or Zn2+ cations at different concentrations of the exchange solution and characterised by thermal analysis and nitrogen adsorption. The acidity of the samples was investigated by pyridine adsorption-desorption followed by in situ Fourier transform infrared (FTIR) spectroscopy. Desorption was carried out at 150, 250 and 350 °C. The objective is to estimate the nature and concentration of the acid sites. A comparison between the binary (Cu(x)X, Zn(x)X) and ternary (CuZn(x)X) exchanges was also established (x = level of exchange) through the Cu(43)X, Zn(48)X and CuZn(50)X samples. Lewis acidity decreases overall with desorption temperature and the level of exchange. As the latter increases, some Lewis sites are converted into Brønsted sites during thermal treatment. Conversely, the concentration of Brønsted sites increases with the degree of exchange. The Brønsted acidity of CuZn(50)X at 350 °C is greater than the sum of those of Cu(43)X and Zn(48)X: the values found were 73, 32 and 15 μmol g⁻¹, respectively. Moreover, the concentration of Brønsted sites for CuZn(50)X increases with desorption temperature. These features indicate a synergistic effect amplifying the strength of these sites when Cu2+ and Zn2+ cations compete for the occupancy of sites distributed inside the zeolitic cavities.

Keywords: acidity, adsorption, pyridine, zeolites

Procedia PDF Downloads 230
24973 Synthesis and Characterization of AFe₂O₄ (A = Ca, Co, Cu) Nano-Spinels: Application to Hydrogen Photochemical Production under Visible Light Irradiation

Authors: H. Medjadji, A. Boulahouache, N. Salhi, A. Boudjemaa, M. Trari

Abstract:

Hydrogen from renewable sources, such as solar, is referred to as green hydrogen. The water-splitting process using semiconductors as photocatalysts has attracted significant attention due to its potential for addressing the energy crisis and environmental pollution. Spinel ferrites of the MFe₂O₄ type have attracted broad interest for diverse energy conversion processes, including fuel cells and photoelectrocatalytic water splitting. This work focuses on preparing iron-based nano-spinels AFe₂O₄ (A = Ca, Co, and Cu) as photocatalysts using the nitrate method. These materials were characterized both physically and optically and subsequently tested for hydrogen generation under visible light irradiation. Various techniques were used to investigate the properties of the materials, including thermal analysis (TGA-DTA), X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), UV-visible spectroscopy, scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM-EDX), and X-ray photoelectron spectroscopy (XPS). XRD analysis confirmed the formation of pure phases at 850 °C, with crystallite sizes of 31 nm for CaFe₂O₄, 27 nm for CoFe₂O₄, and 40 nm for CuFe₂O₄. The energy gaps, calculated from the recorded diffuse reflectance data, are 1.85 eV for CaFe₂O₄, 1.27 eV for CoFe₂O₄, and 1.64 eV for CuFe₂O₄. SEM micrographs showed homogeneous grains with uniform shapes and medium porosity in all samples. EDX elemental analysis confirmed the absence of any contaminating elements, highlighting the high purity of the materials prepared via the nitrate route. XPS spectra revealed the presence of Fe³⁺ and O in all samples, as well as Ca²⁺, Co²⁺, and Cu²⁺ on the surfaces of the CaFe₂O₄, CoFe₂O₄, and CuFe₂O₄ spinels, respectively. The photocatalytic activity was evaluated by measuring H₂ evolution through the water-splitting process. The best performance was achieved with CaFe₂O₄ in a neutral medium (pH ~ 7), yielding 189 µmol at an optimal temperature of ~50 °C. The highest hydrogen production for CoFe₂O₄ and CuFe₂O₄ was obtained at pH ~ 12, with release amounts of 65 and 85 µmol, respectively, under visible light irradiation at the same optimal temperature. Various conditions were investigated, including the pH of the solution, the use of hole scavengers, and recyclability.

Keywords: hydrogen, MFe₂O₄, nitrate route, spinel ferrite

Procedia PDF Downloads 44