Search results for: 2d and 3d data conversion
25890 MBES-CARIS Data Validation for the Bathymetric Mapping of Shallow Water in the Kingdom of Bahrain on the Arabian Gulf
Authors: Abderrazak Bannari, Ghadeer Kadhem
Abstract:
The objectives of this paper are the validation and evaluation of MBES-CARIS BASE surface data performance for bathymetric mapping of shallow water in the Kingdom of Bahrain. The latter is an archipelago with a total land area of about 765.30 km², approximately 126 km of coastline and 8,000 km² of marine area, located in the Arabian Gulf, east of Saudi Arabia and west of Qatar (26° 00’ N, 50° 33’ E). To achieve our objectives, bathymetric attributed grid files (X, Y, and depth) generated from the coverage of ship-track MBES data with 300 x 300 m cells, processed with CARIS-HIPS, were downloaded from the General Bathymetric Chart of the Oceans (GEBCO). These files were then brought into ArcGIS and converted into a raster format in five steps: exportation of the GEBCO BASE surface data to an ASCII file; conversion of the ASCII file to a point shapefile; extraction of the points covering the water boundary of the Kingdom of Bahrain; multiplication of the depth values by -1 to obtain negative values; and, finally, interpolation with the simple kriging method in the ArcMap environment to generate a new raster bathymetric grid surface of 30 × 30 m cells, which was the basis of the subsequent analysis. For validation purposes, 2,200 bathymetric points were extracted from a medium-scale nautical map (1:100,000) considering different depths over the Bahrain national water boundary. The nautical map was scanned, georeferenced and overlaid on the MBES-CARIS generated raster bathymetric grid surface (step 5 above), and then homologous depth points were selected. Statistical analysis, expressed as a linear error at the 95% confidence level, showed a strong correlation coefficient (R² = 0.96) and a low RMSE (± 0.57 m) between the nautical map and the derived MBES-CARIS depths if we consider only the shallow areas with depths of less than 10 m (about 800 validation points).
When we consider only deeper areas (> 10 m), the correlation coefficient is equal to 0.73 and the RMSE is equal to ± 2.43 m, while if we consider the totality of the 2,200 validation points including all depths, the correlation coefficient is still significant (R² = 0.81) with a satisfactory RMSE (± 1.57 m). This variation is most likely caused by the MBES not completely covering the bottom in several of the deeper pockmarks because of the rapid change in depth. In addition, steep slopes and the rough seafloor probably affect the acquired MBES raw data. Moreover, the interpolation of missing values between MBES acquisition swath lines (ship-tracked sounding data) may not reflect the true depths of these missed areas. Globally, however, the MBES-CARIS data are very appropriate for bathymetric mapping of shallow water areas.
Keywords: bathymetry mapping, multibeam echosounder systems, CARIS-HIPS, shallow water
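The pre-processing chain described above (ASCII export, point conversion, depth sign flip) can be sketched outside ArcGIS as well. The following minimal Python illustration uses an invented record layout for the ASCII file and is not the CARIS/ArcGIS workflow itself:

```python
# Illustrative sketch of steps 1-4 of the conversion chain:
# parse an ASCII export of (X, Y, depth) records and flip the
# sign of the depth so values below sea level become negative.
# The record layout here is an assumption, not GEBCO's format.

def ascii_to_points(ascii_text):
    """Parse whitespace-separated 'X Y depth' lines into point tuples."""
    points = []
    for line in ascii_text.strip().splitlines():
        x, y, depth = (float(v) for v in line.split())
        points.append((x, y, -depth))  # multiply depth by -1 (step 4)
    return points

sample = """50.55 26.01 3.2
50.56 26.02 7.8"""
print(ascii_to_points(sample))
```

The final 30 × 30 m raster would then be produced by simple kriging of these points, which the authors perform in ArcMap.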
Procedia PDF Downloads 383
25889 Algorithms Used in Spatial Data Mining GIS
Authors: Vahid Bairami Rad
Abstract:
Extracting knowledge from spatial data such as GIS data is important for reducing the data and extracting information. The development of new techniques and tools that support humans in transforming data into useful knowledge has therefore been the focus of the relatively new and interdisciplinary research area of knowledge discovery in databases. We introduce a set of database primitives, or basic operations, for spatial data mining which are sufficient to express most of the spatial data mining algorithms from the literature. This approach has several advantages. Similar to the relational standard language SQL, the use of standard primitives will speed up the development of new data mining algorithms and will also make them more portable. We introduce a database-oriented framework for spatial data mining based on the concepts of neighborhood graphs and paths. A small set of basic operations on these graphs and paths is defined as database primitives for spatial data mining. Furthermore, techniques to efficiently support the database primitives with a commercial DBMS are presented.
Keywords: spatial database, knowledge discovery in databases, data mining, spatial relationship, predictive data mining
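The neighborhood-graph primitive the abstract builds on can be illustrated in a few lines. This is a generic sketch (the distance threshold and point data are invented for illustration), not the authors' DBMS implementation:

```python
from math import dist  # Euclidean distance, Python 3.8+

def neighborhood_graph(points, max_distance):
    """Database primitive sketch: connect every pair of spatial
    objects whose Euclidean distance is within max_distance."""
    graph = {p: [] for p in points}
    for i, p in enumerate(points):
        for q in points[i + 1:]:
            if dist(p, q) <= max_distance:
                graph[p].append(q)
                graph[q].append(p)
    return graph

pts = [(0, 0), (0, 1), (5, 5)]
g = neighborhood_graph(pts, 1.5)
print(g[(0, 0)])  # the only neighbor of (0, 0) within 1.5 is (0, 1)
```

Paths on such a graph are then just sequences of connected nodes, over which further mining operations can be expressed.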
Procedia PDF Downloads 466
25888 Data Stream Association Rule Mining with Cloud Computing
Authors: B. Suraj Aravind, M. H. M. Krishna Prasad
Abstract:
There exist emerging applications of data streams that require association rule mining, such as network traffic monitoring, web click stream analysis, sensor data, data from satellites, etc. Data streams typically arrive continuously, at high speed, in huge amounts, and with changing data distributions. This raises new issues that need to be considered when developing association rule mining techniques for stream data. This paper proposes an improved data stream association rule mining algorithm that eliminates resource limitations by using the concept of cloud computing. This inclusion may introduce additional, as yet unknown, problems which need further research.
Keywords: data stream, association rule mining, cloud computing, frequent itemsets
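Stream mining under bounded memory is the core difficulty the abstract points at. As background, here is a much-simplified single-item frequency sketch in the spirit of the classic Lossy Counting algorithm; it is illustrative only and is not the algorithm proposed in the paper:

```python
def lossy_count(stream, epsilon):
    """Approximate frequent-item counting over a stream with bounded
    memory: counts are pruned at every bucket boundary of width 1/eps."""
    width = int(1 / epsilon)
    counts, deltas, bucket = {}, {}, 1
    for n, item in enumerate(stream, start=1):
        if item in counts:
            counts[item] += 1
        else:
            counts[item] = 1
            deltas[item] = bucket - 1  # max undercount for a new item
        if n % width == 0:  # bucket boundary: prune infrequent items
            for k in [k for k in counts if counts[k] + deltas[k] <= bucket]:
                del counts[k], deltas[k]
            bucket += 1
    return counts

stream = ["a", "a", "b", "a", "c", "a", "a", "b"]
print(lossy_count(stream, epsilon=0.25))
```

Frequent-itemset mining over streams extends the same idea from single items to sets of items, which is where the resource pressure the paper addresses comes from.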
Procedia PDF Downloads 505
25887 Transformation of Industrial Policy towards Industry 4.0 and Its Impact on Firms' Competition
Authors: Arūnas Burinskas
Abstract:
Europe is on the threshold of a new industrial revolution called Industry 4.0. Many believe that it will increase the flexibility of production, the mass adaptation of products to consumers and the speed of their service; it will also improve product quality and dramatically increase productivity. However, all the expected benefits of Industry 4.0 come with inevitable changes and challenges. One of them is the transformation of current competition and business models. This article examines the possible results of a competitive conversion from the classic Bertrand and Cournot models to a qualitatively new competition based on innovation. The ability to deliver a new product quickly and to produce individual designs (through flexible and quickly configurable factories), while reducing equipment failures and increasing process automation and control, is highly important. This study shows that the ongoing transformation of the competition model is changing the game. Together with the creation of complex value networks, it requires huge investments, which makes it particularly difficult for small and medium-sized enterprises. In addition, the ongoing digitalization of data raises new concerns regarding legal obligations, intellectual property, and security.
Keywords: Bertrand and Cournot competition, competition model, Industry 4.0, industrial organisation, monopolistic competition
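For reference, the classic benchmarks the abstract refers to can be stated compactly. These are standard textbook formulations (linear inverse demand P(Q) = a − bQ, constant marginal cost c), not results from this paper:

```latex
% Cournot duopoly: firms choose quantities q_1, q_2
q_i^{*} = \frac{a - c}{3b}, \qquad P^{*} = \frac{a + 2c}{3}

% Bertrand duopoly: firms choose prices; with identical marginal
% costs, price competition drives price down to marginal cost
p_1^{*} = p_2^{*} = c
```

Innovation-based competition departs from both benchmarks because firms no longer compete only on quantity or price but on product differentiation and speed of delivery.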
Procedia PDF Downloads 144
25886 A Comprehensive Survey and Improvement to Existing Privacy Preserving Data Mining Techniques
Authors: Tosin Ige
Abstract:
Ethics must be a condition of the world, like logic (Ludwig Wittgenstein, 1889-1951). As important as data mining is, it poses a significant threat to ethics, privacy, and legality, since it makes it difficult for an individual or consumer (in the case of a company) to control the accessibility and usage of their data. This research focuses on current issues and the latest research and development in privacy-preserving data mining methods as of 2022. It also discusses some advances in those techniques while at the same time highlighting and providing a new technique as a solution to an existing privacy-preserving data mining method. This paper also bridges the wide gap between data mining and the Web Application Programming Interface (web API), where research is urgently needed for an added layer of security in data mining, while at the same time introducing a seamless and more efficient way of data mining.
Keywords: data, privacy, data mining, association rule, privacy preserving, mining technique
Procedia PDF Downloads 178
25885 Big Data: Concepts, Technologies and Applications in the Public Sector
Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora
Abstract:
Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD sets. The integration of BD with the Hadoop Framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that make network access widely available, while the volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.
Keywords: big data, big data analytics, Hadoop, cloud
Procedia PDF Downloads 314
25884 Semantic Data Schema Recognition
Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia
Abstract:
This paper aims at assisting the user in a data quality approach. The goal is to better extract, mix, interpret and reuse data. It deals with the semantic schema recognition of a data source, which enables the extraction of data semantics from all the available information, including the data and the metadata. It consists, firstly, of categorizing the data by assigning it to a category and possibly a sub-category and, secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and alternatives for correcting the data. This approach allows the automatic detection of a large number of syntactic and semantic anomalies.
Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies between columns
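The first stage, assigning a column to a category from its values, can be approximated with simple pattern rules. The sketch below uses invented patterns and category names for illustration; it is not the authors' profiling tool:

```python
import re

# Hypothetical value patterns for a few common semantic categories.
PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "date": re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "number": re.compile(r"^-?\d+(\.\d+)?$"),
}

def categorize_column(values):
    """Assign the column to the first category matched by every value,
    falling back to 'text' when no pattern covers the whole column."""
    for category, pattern in PATTERNS.items():
        if all(pattern.match(v) for v in values):
            return category
    return "text"

print(categorize_column(["a@x.org", "b@y.com"]))      # email column
print(categorize_column(["2024-01-31", "2023-12-01"]))  # date column
```

A real system would combine such value-level rules with metadata (column names, types) to establish the inter-column dependencies the paper describes.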
Procedia PDF Downloads 419
25883 Impact of Private Oil Palm Expansion on Indonesia Tropical Forest Deforestation Rate: Case Study in the Province of Riau
Authors: Arzyana Sunkar, Yanto Santosa, Intan Purnamasari, Yohanna Dalimunthe
Abstract:
A variety of negative allegations have criticized Indonesian oil palm plantations as being environmentally unfriendly. One important allegation that must be verified is that the expansion of Indonesian oil palm plantations has increased the deforestation rate of primary tropical forest. In this context, a study was conducted on the origin and history of the status and land use of 8 private oil palm plantations (with a total area of 46,372.38 ha) located in Riau Province. Several methods were employed: (1) analysis of overlay maps between the oil palm plantations studied and the 1986 Forest Map Governance Agreement (TGHK) as well as the 1994 and 2014 Riau Provincial Spatial Plans (RTRWP); (2) study of the Cultivation Right on Land (HGU) documents, including the Forestry Ministerial Decree on the release of forest areas; and (3) interpretation of Landsat imagery of bands 542, covering 3 years before and after the oil palm plantations began operating. In addition, field cross-checks and interviews were conducted with the National Land Agency, the Plantation and Forestry Office, and community figures. The results indicated that 1.95% of the oil palm plantations under study were converted from production forest, 30.34% from limited production forest and 67.70% from areas for other usage/conversion production forest. One year prior to the establishment of the plantations, the land cover types comprised rubber plantations (49.96%), secondary forest (35.99%), bare land (10.17%), shrubs (3.03%) and mixed dryland farming-shrubs (0.84%), whereas the land use types comprised 35.99% forest concession areas, 14.04% migrant dryland farms, and 49.96% Cultivation Right on Land of other companies.
These results indicated that most of the private oil palm plantations under study resulted from the conversion of production forests, and that the previous land use was not primary forest but rubber plantations and secondary forests.
Keywords: land cover types, land use history, primary forest, private oil palm plantations
Procedia PDF Downloads 243
25882 Access Control System for Big Data Application
Authors: Winfred Okoe Addy, Jean Jacques Dominique Beraud
Abstract:
Access control systems (ACs) are some of the most important components in safety areas. Inaccuracies in regulatory frameworks make personal policies and remedies more appropriate than standard models or protocols. This problem is exacerbated by the increasing complexity of software, such as integrated Big Data (BD) software for controlling large volumes of encrypted data and resources embedded in a dedicated BD production system. This paper proposes a general access control strategy for the diffusion of Big Data domains, since it is crucial to secure the data provided to data consumers (DCs). We present a general access control circulation strategy for the Big Data domain, describing the benefit of using designated access control for BD units and taking into consideration the performance needs of BD and AC systems. We then present a generic Big Data access control system to improve the dissemination of Big Data.
Keywords: access control, security, Big Data, domain
Procedia PDF Downloads 139
25881 Performance of Growing Rahaji Bulls Fed Diets Containing Similar Concentrates and Different Crop Residues in a Semi-Arid Environment
Authors: Husaini Sama
Abstract:
The study was conducted, in a 120-day trial, to monitor the performance of growing Rahaji bulls fed different crop residues. There were four experimental treatments, each containing three (3) bull calves. The first three experimental diets were prepared with rice straw, millet stalks, and a combination of the two in equal proportions, and were supplemented with concentrates. Treatments 1, 2 and 3 thus consisted of rice straw, millet stalk, and a combination of rice straw and millet stalk in equal ratio, respectively, as basal feeds, while Treatment 4 (a standard diet of cowpea haulms, rice straw and wheat offal) served as a control for comparison with the other treatments. Data on feed intake and livability were collected daily, data on live weight gain and feed conversion ratio were collected fortnightly, and data for the apparent nutrient retention trial were collected towards the end of the experiment. Water was offered ad libitum. Records were subjected to statistical analysis using the SPSS (1988) software package in accordance with a Completely Randomized Design (CRD). The results indicated that feed intake was significantly higher (P<0.05) for calves on treatments 3 and 4 compared to those on treatments 1 and 2. The study observed that it was cheaper to formulate diets 2 and 3 than the other two diets, and the control diet (T4) was relatively more expensive than the three formulated diets. It was concluded that a concentrate-supplemented diet combining rice straw and cereal stalks was economical and satisfactory for feeding growing Rahaji bulls in this ecological zone (semi-arid environment).
Keywords: Rahaji bulls, crop residues, concentrates, semi-arid environment
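Feed conversion ratio, one of the fortnightly measures, is simply feed consumed divided by live weight gained. A small sketch with hypothetical numbers (not data from this trial):

```python
def feed_conversion_ratio(feed_intake_kg, weight_gain_kg):
    """FCR = total feed consumed / total live weight gained.
    Lower values mean the animal converts feed more efficiently."""
    if weight_gain_kg <= 0:
        raise ValueError("weight gain must be positive")
    return feed_intake_kg / weight_gain_kg

# Hypothetical fortnight: a bull eats 84 kg of feed and gains 12 kg.
print(feed_conversion_ratio(84.0, 12.0))  # → 7.0
```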
Procedia PDF Downloads 189
25880 Modelling Fluidization by Data-Based Recurrence Computational Fluid Dynamics
Authors: Varun Dongre, Stefan Pirker, Stefan Heinrich
Abstract:
Over the last decades, the numerical modelling of fluidized bed processes has become feasible even for industrial processes. Commonly, continuous two-fluid models are applied to describe large-scale fluidization. In order to allow for coarse grids, novel two-fluid models account for unresolved sub-grid heterogeneities. However, computational efforts remain high – in the order of several hours of compute time for a few seconds of real time – thus preventing the representation of long-term phenomena such as heating or particle conversion processes. In order to overcome this limitation, data-based recurrence computational fluid dynamics (rCFD) has been put forward in recent years. rCFD can be regarded as a data-based method that relies on the numerical predictions of a conventional short-term simulation. This data is stored in a database and then used by rCFD to efficiently time-extrapolate the flow behavior at high spatial resolution. This study compares the numerical predictions of rCFD simulations with those of corresponding full CFD reference simulations for lab-scale and pilot-scale fluidized beds. In assessing the predictive capabilities of rCFD simulations, we focus on solid mixing and secondary gas holdup. We observed that predictions made by rCFD simulations are highly sensitive to numerical parameters such as the diffusivity associated with face swaps. We achieved a computational speed-up of four orders of magnitude (10,000 times faster than a classical TFM simulation), eventually allowing for real-time simulations of fluidized beds. In the next step, we apply the checkerboarding technique by introducing gas tracers subjected to convection and diffusion. We then analyze the concentration profiles by observing mixing and transport of the gas tracers, gaining insights into their convective and diffusive patterns, and working towards heat and mass transfer methods.
Finally, we run rCFD simulations and calibrate them with numerical and physical parameters against conventional two-fluid model (full CFD) simulations. As a result, this study gives a clear indication of the applicability, predictive capabilities, and existing limitations of rCFD in the realm of fluidization modelling.
Keywords: multiphase flow, recurrence CFD, two-fluid model, industrial processes
Procedia PDF Downloads 80
25879 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment
Authors: Michael Gidey Gebru
Abstract:
Most Data Envelopment Analysis models operate in a static environment with input and output parameters chosen as deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to expand crisp Data Envelopment Analysis into Data Envelopment Analysis with a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. The Data Envelopment Analysis model with a fuzzy environment is then solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed model is illustrated with an application to real data from 50 educational institutions.
Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output
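The triangular fuzzy inputs and outputs the abstract mentions are usually written as triples (a, m, b), and one common way to compare them is centroid defuzzification. The sketch below shows only that representation with hypothetical numbers; it is not the paper's multi-objective DEA model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TriangularFuzzyNumber:
    """Fuzzy value with lower bound a, peak m, upper bound b (a <= m <= b)."""
    a: float
    m: float
    b: float

    def centroid(self):
        # Centroid (center of gravity) of a triangular membership function.
        return (self.a + self.m + self.b) / 3.0

# Hypothetical fuzzy output of a decision-making unit, e.g. a count
# of graduates whose exact value is uncertain.
output = TriangularFuzzyNumber(a=90.0, m=100.0, b=113.0)
print(output.centroid())  # → 101.0
```

In a fuzzy DEA model, such numbers replace the crisp inputs and outputs, and efficiency scores themselves become fuzzy or are derived via multi-objective optimization as in the paper.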
Procedia PDF Downloads 68
25878 Creative Mapping Landuse and Human Activities: From the Inventories of Factories to the History of the City and Citizens
Authors: R. Tamborrino, F. Rinaudo
Abstract:
Digital technologies offer possibilities to effectively convert historical archives into instruments of knowledge able to provide a guide for the interpretation of historical phenomena. The digital conversion and management of those documents make it possible to add other sources in a unique and coherent model that permits the intersection of different data, opening new interpretations and understandings. Urban history uses, among other sources, inventories that register human activities in a specific space (e.g., cadastres, censuses, etc.). The geographic localisation of that information within cartographic supports allows for the comprehension and visualisation of specific relationships between different historical realities, registering both the urban space and the people living there. These links, which merge data and documentation of different natures through a new organisation of the information, can suggest new interpretations of other related events. In all these kinds of analysis, the use of GIS platforms today represents the most appropriate answer. The design of the related databases is the key to realising an ad-hoc instrument that facilitates the analysis and the intersection of data of different origins. Moreover, GIS has become the digital platform where it is possible to add other kinds of data visualisation. This research deals with the industrial development of Turin at the beginning of the 20th century. A census of factories realized just prior to WWI provides the opportunity to test the potential of GIS platforms for the analysis of urban landscape modifications during the first industrial development of the town. The inventory includes data about location, activities, and people. GIS is shaped in a creative way, linking different sources and digital systems, with the aim of creating a new type of platform conceived as an interface integrating different kinds of data visualisation.
The data processing allows this information to be linked to the urban space and also visualises the growth of the city at that time. The sources related to the urban landscape development in that period are of a different nature. The emerging necessity to build, enlarge, modify and join different buildings to boost industrial activities, in step with their fast development, is recorded in the official permissions delivered by the municipality and now stored in the Historical Archive of the Municipality of Turin. Those documents, which are reports and drawings, contain numerous data on the buildings themselves, including the block where the plot is located, the district, and the people involved, such as the owner, the investor, and the engineer or architect designing the industrial building. All these collected data offer the possibility, firstly, to re-build the process of change of the urban landscape by using GIS and 3D modelling technologies, thanks to access to the drawings (2D plans, sections and elevations) that show the previous and the planned situations. Furthermore, they provide information for different queries of the linked dataset that could be useful for different research targets, such as economic, biographical, architectural, or demographic studies. By superimposing a layer of the present city, the past meets the present industrial heritage, and people meet urban history.
Keywords: digital urban history, census, digitalisation, GIS, modelling, digital humanities
Procedia PDF Downloads 191
25877 Characterization of the Queuine Salvage Pathway From Bacteria in the Human Parasite Entamoeba Histolytica
Authors: Lotem Sarid, Meirav Trebicz-Geffen, Serge Ankri
Abstract:
Queuosine (Q) is a naturally occurring modified nucleoside found in the first position of transfer RNA anticodons such as Asp, Asn, His, and Tyr. As eukaryotes lack pathways to synthesize queuine, the nucleobase of queuosine, they must obtain it from their diet or gut microbiota. Our previous work investigated the effects of queuine on the physiology of the eukaryotic parasite Entamoeba histolytica and defined the enzyme EhTGT responsible for its incorporation into tRNA. To the best of our knowledge, it is unknown how E. histolytica salvages Q from gut bacteria. We used N-acryloyl-3-aminophenylboronic acid (APB) PAGE analysis to demonstrate that E. histolytica trophozoites can salvage queuine from Q or E. coli K12, but not from the modified E. coli QueC strain, which cannot produce queuine. Next, we examined the role of EhDUF2419, a protein with homology to DNA glycosylase, as a queuine salvage enzyme in E. histolytica. When EhDUF2419 expression is silenced, the conversion of Q to queuine is inhibited, resulting in a decrease in Q-tRNA levels. We also observed that Q protects control trophozoites from oxidative stress (OS), but not siEhDUF2419 trophozoites. Overall, our data reveal that EhDUF2419 is central to the salvaging of queuine from bacteria and to the resistance of the parasite to OS.
Keywords: Entamoeba histolytica, epitranscriptomics, gut microbiota, queuine, queuosine, response to oxidative stress, tRNA modification
Procedia PDF Downloads 128
25876 Defining New Limits in Hybrid Perovskites: Single-Crystal Solar Cells with Exceptional Electron Diffusion Length Reaching Half Millimeters
Authors: Bekir Turedi
Abstract:
Exploiting the potential of perovskite single-crystal solar cells in optoelectronic applications necessitates overcoming a significant challenge: the low charge collection efficiency at increased thickness, which has restricted their deployment in radiation detectors and nuclear batteries. Our research details a promising approach to this problem, wherein we have successfully fabricated single-crystal MAPbI3 solar cells employing a space-limited inverse temperature crystallization (ITC) methodology. Remarkably, these cells, up to 400-fold thicker than current-generation perovskite polycrystalline films, maintain a high charge collection efficiency even without external bias. The crux of this achievement lies in the long electron diffusion length within these cells, estimated to be around 0.45 mm. This extended diffusion length ensures the conservation of high charge collection and power conversion efficiencies, even as the thickness of the cells increases. Fabricated cells at 110, 214, and 290 µm thickness manifested power conversion efficiencies (PCEs) of 20.0, 18.4, and 14.7% respectively. The single crystals demonstrated nearly optimal charge collection, even when their thickness exceeded 200 µm. Devices of thickness 108, 214, and 290 µm maintained 98.6, 94.3, and 80.4% of charge collection efficiency relative to their maximum theoretical short-circuit current value, respectively. Additionally, we have proposed an innovative, self-consistent technique for ascertaining the electron-diffusion length in perovskite single crystals under operational conditions. The computed electron-diffusion length approximated 446 µm, significantly surpassing previously reported values for this material. In conclusion, our findings underscore the feasibility of fabricating halide perovskite single-crystal solar cells of hundreds of micrometers in thickness while preserving high charge extraction efficiency and PCE. 
This advancement paves the way for developing perovskite-based optoelectronics necessitating thicker active layers, such as X-ray detectors and nuclear batteries.
Keywords: perovskite, solar cell, single crystal, diffusion length
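As background for the ~0.45 mm figure, the electron diffusion length follows the standard semiconductor relations (textbook physics, not the paper's self-consistent extraction method):

```latex
L_D = \sqrt{D\,\tau}, \qquad D = \frac{\mu\, k_B T}{q}
```

Here D is the diffusivity (given by the Einstein relation), τ the carrier lifetime, μ the electron mobility, k_B Boltzmann's constant, T the temperature, and q the elementary charge. A long L_D means carriers can traverse a thick absorber before recombining, which is why charge collection stays high even at hundreds of micrometers.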
Procedia PDF Downloads 56
25875 Solving a Micromouse Maze Using an Ant-Inspired Algorithm
Authors: Rolando Barradas, Salviano Soares, António Valente, José Alberto Lencastre, Paulo Oliveira
Abstract:
This article reviews Ant Colony Optimization, a nature-inspired algorithm, and its implementation in the Scratch/m-Block programming environment. Ant Colony Optimization belongs to the family of Swarm Intelligence-based algorithms and is a subset of biologically-inspired algorithms. We start with a problem in which one has a maze and needs to find a path to its center and return to the starting position. This is similar to an ant looking for a path to a food source and returning to its nest. Starting with the implementation of a simple wall-follower simulator, the proposed solution uses a dynamic graphical interface that allows young students to observe the ants’ movement while the algorithm optimizes the routes to the maze’s center. Details such as interface usability, data structures, and the conversion of algorithmic language to Scratch syntax were addressed during this implementation. This gives young students an easier way to understand the computational concepts of sequences, loops, parallelism, data, events, and conditionals, as they are used throughout the implemented algorithms. Future work includes simulations with real contest mazes and two different pheromone update methods, and the comparison with the optimized results of the winners of each edition of the contest. It will also include the creation of a Digital Twin relating the virtual simulator to a real micromouse in a full-size maze. The first test results show that the algorithm found the same optimized solutions as the winners of each edition of the Micromouse contest, making this a good solution for maze pathfinding.
Keywords: nature-inspired algorithms, Scratch, micromouse, problem-solving, computational thinking
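The core pheromone bookkeeping behind Ant Colony Optimization can be shown in a few lines. This is a generic sketch of the classic evaporation-and-deposit rule, not the article's Scratch/m-Block implementation:

```python
def update_pheromone(pheromone, ant_paths, evaporation=0.5, q=1.0):
    """Classic ACO update: every edge's pheromone evaporates, then
    each ant deposits pheromone inversely proportional to its path
    length, so shorter paths accumulate more pheromone over time."""
    for edge in pheromone:
        pheromone[edge] *= (1.0 - evaporation)
    for path in ant_paths:
        deposit = q / len(path)
        for edge in path:
            pheromone[edge] += deposit
    return pheromone

# Two maze edges; one ant reached the goal using only edge ('A', 'B').
tau = {("A", "B"): 1.0, ("B", "C"): 1.0}
update_pheromone(tau, ant_paths=[[("A", "B")]])
print(tau)  # ('A', 'B') is reinforced; ('B', 'C') only evaporates
```

Subsequent ants then choose edges with probability proportional to pheromone, which is what gradually concentrates traffic on the optimized route to the maze's center.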
Procedia PDF Downloads 128
25874 Using Urban Conversion to Green Public Space as a Tool to Generate Urban Change: Case of Seoul
Authors: Rachida Benabbou, Sang Hun Park, Hee Chung Lee
Abstract:
The world’s population is increasing at an unprecedented speed, leading to a fast-growing pace of urbanization. Since the Industrial Revolution, cities have evolved to fit the growing demand for infrastructure, roads, transportation, and housing. Through this evolution, cities have grown into grey, polluted, and vehicle-oriented urban areas with a significant lack of green spaces, and consequently a low quality of life for citizens. Therefore, many cities nowadays are revising the way we think about urbanism and are trying to become more livable and citizen-friendly by creating change from the inside out. Cities are trying to bring nature back into their crowded grey centers and to regenerate many urban areas as green public spaces, not only as a way to give new breath to the city, but also as a way to create change on the environmental, social and economic levels. The city of Seoul is one of the fast-growing global cities. Its population is over 12 million, and it is expected to continue to grow to a point where the quality of life may seriously deteriorate. As most green areas in Seoul are located in the suburbs in the form of mountains, the city’s urban areas suffer from a lack of accessible green spaces within walking distance. Understanding the gravity and consequences of this issue, Seoul city is undergoing major changes. Many of its projects are oriented toward green public spaces where citizens can enjoy public life in healthy outdoor surroundings. The aim of this paper is to explore the results of urban conversions into green public spaces. Starting from different locations, natures, sizes, and scales, these conversions can lead to significant change in the surrounding areas and thus can be used as an efficient tool of regeneration for urban areas.
Through a comparative analysis of three different types of urban conversion projects in the city of Seoul, we try to show the positive urban influence of the outcomes, in order to encourage cities to use green spaces as a strategic tool for urban regeneration and redevelopment.
Keywords: urban conversion, green public space, change, urban regeneration
Procedia PDF Downloads 309
25873 The Co-Simulation Interface SystemC/Matlab Applied in JPEG and SDR Application
Authors: Walid Hassairi, Moncef Bousselmi, Mohamed Abid
Abstract:
Functional verification is a major part of today’s system design task. Several approaches are available for verification at a high abstraction level, where designs are often modeled using MATLAB/Simulink. However, differing approaches are a barrier to a unified verification flow. In this paper, we propose a co-simulation interface between SystemC and MATLAB/Simulink to enable the functional verification of multi-abstraction-level designs. The resulting verification flow is tested on a JPEG compression algorithm. The required synchronization of both simulation environments, as well as data type conversion, is solved using the proposed co-simulation flow. We divided the JPEG encoder into two parts: the first, the DCT, is implemented in SystemC and represents the HW part; the second, consisting of quantization and entropy encoding, is implemented in MATLAB and represents the SW part. For communication and synchronization between these two parts, we use an S-Function and the MATLAB engine in Simulink. With this research premise, this study introduces a new hardware SystemC implementation of the DCT. We compare the results of our co-simulation to a SW/SW simulation and observe a reduction in simulation time of 88.15% for JPEG, with a design efficiency of 90% for SDR.
Keywords: hardware/software co-design, co-simulation, SystemC, MATLAB, S-Function, communication, synchronization
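The DCT stage implemented in SystemC is typically verified against a golden reference model on the SW side. Below is a NumPy sketch of the 8 × 8 two-dimensional DCT-II such a reference could use (an illustrative model only, not the paper's SystemC module or its MATLAB code):

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix C, so that Y = C @ X @ C.T."""
    c = np.zeros((n, n))
    for k in range(n):
        scale = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
        for i in range(n):
            c[k, i] = scale * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    return c

def dct2(block):
    """2-D DCT of one 8x8 pixel block, as used in JPEG encoding."""
    c = dct_matrix(block.shape[0])
    return c @ block @ c.T

flat = np.full((8, 8), 128.0)  # a uniform grey block
coeffs = dct2(flat)
print(coeffs[0, 0])  # DC term: 128 * 8 = 1024 for a constant block
```

In the co-simulation flow, the outputs of the SystemC DCT would be compared coefficient-by-coefficient against such a reference before quantization and entropy encoding proceed on the MATLAB side.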
Procedia PDF Downloads 411
25872 Effect of Feeding Camel Rumen Content on Growth Performance and Haematological Parameters of Broiler Chickens under Semi-Arid Condition
Authors: Alhaji Musa Abdullahi, Usman Abdullahi, Adamu Adamu, Aminu Maidala
Abstract:
One hundred and fifty (150) day-old chicks were randomly allocated into five dietary treatments, each replicated twice with fifteen birds per replicate. Camel rumen content (CRC) was included in broiler diets at 0, 5, 10, 15, and 20% to replace maize and groundnut cake, to evaluate the effect on performance and haematological parameters at the starter and finisher phases. A completely randomized design was used; 600 g of feed was given daily, and water was given ad libitum. At the starter phase, daily weight gain and feed conversion ratio were significantly affected by the test ingredient, although T1 (0% CRC), which served as the control, was similar to T2 (5% CRC), T3 (10% CRC), and T4 (15% CRC), while the lowest value was recorded in T5 (20% CRC). The results indicate that CRC can be included in the starter diet at levels of up to 15% to replace maize and groundnut cake without any effect on performance. At the finisher phase, however, daily feed intake, daily weight gain, and feed conversion ratio showed no significant (P > 0.05) difference among the dietary treatments. Similarly, packed cell volume (PCV), red blood cell (RBC) count, white blood cell (WBC) count, mean corpuscular volume (MCV), and mean corpuscular haemoglobin (MCH) did not differ significantly (P > 0.05) among the dietary treatments, while haemoglobin (Hb) and mean corpuscular haemoglobin concentration (MCHC) differed significantly. The differential counts of eosinophils, heterophils, and lymphocytes differed significantly among the treatment groups, while those of basophils and monocytes showed no significant difference. This means up to 20% CRC can be used to replace maize and groundnut cake in the finisher diet without any adverse effect on the performance and haematological parameters of the chickens.
Keywords: camel, rumen content, growth, hematology
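The feed conversion ratio reported above is the standard poultry metric of feed consumed per unit of weight gained. A minimal sketch, using made-up intake and gain figures rather than the study's data:

```python
# Feed conversion ratio (FCR) = feed consumed / weight gained.
# All numbers below are hypothetical illustration values, not results
# from the camel rumen content study.
def feed_conversion_ratio(feed_intake_g, weight_gain_g):
    return feed_intake_g / weight_gain_g

daily_intake = 600.0 / 15  # 600 g offered daily to a replicate of 15 birds
daily_gain = 25.0          # assumed mean daily weight gain per bird (g)
fcr = feed_conversion_ratio(daily_intake, daily_gain)
print(round(fcr, 2))  # 1.6 (lower FCR = more efficient conversion)
```

A lower FCR means the birds convert feed to body weight more efficiently, which is why the metric is compared across the CRC inclusion levels.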
Procedia PDF Downloads 221
25871 Liquid Tin(II) Alkoxide Initiators for Use in the Ring-Opening Polymerisation of Cyclic Ester Monomers
Authors: Sujitra Ruengdechawiwat, Robert Molloy, Jintana Siripitayananon, Runglawan Somsunan, Paul D. Topham, Brian J. Tighe
Abstract:
The main aim of this research has been to design and synthesize completely soluble liquid tin(II) alkoxide initiators for use in the ring-opening polymerisation (ROP) of cyclic ester monomers. This is in contrast to conventional tin(II) alkoxides in solid form, which tend to be molecular aggregates and difficult to dissolve. The liquid initiators prepared were bis(tin(II) monooctoate) diethylene glycol ([Sn(Oct)]2DEG) and bis(tin(II) monooctoate) ethylene glycol ([Sn(Oct)]2EG). Their efficiencies as initiators in the bulk ROP of ε-caprolactone (CL) at 130°C were studied kinetically by dilatometry. Kinetic data over the 20-70% conversion range were used to construct both first-order and zero-order rate plots. It was found that the rate data fitted more closely to first-order kinetics with respect to the monomer concentration, and gave higher first-order rate constants than the corresponding tin(II) octoate/diol initiating systems normally used to generate the tin(II) alkoxide in situ. Since the ultimate objective of this work is to produce copolymers suitable for biomedical use as absorbable monofilament surgical sutures, poly(L-lactide-co-ε-caprolactone) 75:25 mol %, P(LL-co-CL), copolymers were synthesized using both solid and liquid tin(II) alkoxide initiators at 130°C for 48 h. The statistical copolymers were obtained in near-quantitative yields with compositions (from 1H-NMR) close to the initial comonomer feed ratios. The monomer sequencing (from 13C-NMR) was partly random and partly blocky (gradient-type) due to the much differing monomer reactivity ratios (rLL >> rCL). From GPC, the copolymers obtained using the soluble liquid tin(II) alkoxides were found to have higher molecular weights (Mn = 40,000-100,000) than those from the only partially soluble solid initiators (Mn = 30,000-52,000).
Keywords: biodegradable polyesters, poly(L-lactide-co-ε-caprolactone), ring-opening polymerisation, tin(II) alkoxide
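The first-order rate plots mentioned above follow from integrating a rate law that is first order in monomer: ln([M]0/[M]t) = k·t, i.e. ln(1/(1 - conversion)) plotted against time should be linear. A minimal sketch of extracting an apparent rate constant from conversion-time data; the data points below are invented for illustration and are not the dilatometry results of the paper:

```python
import math

# Illustrative conversion-time pairs (NOT the paper's dilatometry data).
times = [0.0, 1.0, 2.0, 3.0, 4.0]             # hours
conversions = [0.00, 0.20, 0.36, 0.49, 0.59]  # fraction of monomer consumed

# First-order in monomer: ln([M]0/[M]t) = ln(1/(1 - x)) = k * t
ys = [math.log(1.0 / (1.0 - x)) for x in conversions]

# Least-squares slope through the origin gives the apparent rate constant.
k = sum(t * y for t, y in zip(times, ys)) / sum(t * t for t in times)
print(round(k, 3))  # apparent first-order rate constant, h^-1 -> 0.223
```

A good linear fit of ln(1/(1 - x)) versus t, as found in the study over the 20-70% conversion range, is what supports the first-order interpretation over the zero-order one.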
Procedia PDF Downloads 194
25870 The Economic Limitations of Defining Data Ownership Rights
Authors: Kacper Tomasz Kröber-Mulawa
Abstract:
This paper addresses the topic of data ownership from an economic perspective, providing examples of economic limitations of data property rights identified using the methods and approaches of the economic analysis of law. To build a proper background for the economic focus, a short overview of data and data ownership in the EU’s legal system is provided first. It includes a short introduction to their political and social importance and highlights relevant viewpoints, stressing the importance of a Single Market for data as well as far-reaching regulation of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and by private businesses). The main discussion of the paper builds upon this legal basis together with the methods and approaches of the economic analysis of law.
Keywords: antitrust, data, data ownership, digital economy, property rights
Procedia PDF Downloads 88
25869 Protecting the Cloud Computing Data Through the Data Backups
Authors: Abdullah Alsaeed
Abstract:
Virtualized computing and cloud computing infrastructures are no longer buzzwords or marketing terms. They are a core reality in today’s corporate Information Technology (IT) organizations. Hence, developing effective and efficient methodologies for data backup and data recovery is required more than ever. The purpose of data backup and recovery techniques is to help organizations strategize their business continuity and disaster recovery approaches. To accomplish this strategic objective, a variety of mechanisms have been proposed in recent years. This research paper explores and examines the latest techniques and solutions for providing data backup and restoration for cloud computing platforms.
Keywords: data backup, data recovery, cloud computing, business continuity, disaster recovery, cost-effective, data encryption
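One of the mechanisms behind the backup techniques surveyed above is incremental backup: copying a file only when its content has changed since the last backup. A minimal local sketch of the idea using content hashes; this illustrates the general principle only and does not use any specific cloud provider's API:

```python
import hashlib
import os
import shutil
import tempfile

def backup_changed(src_dir, dst_dir):
    """Incremental backup sketch: copy a file only when its SHA-256 digest
    differs from the copy already stored in the backup location."""
    os.makedirs(dst_dir, exist_ok=True)

    def digest(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    copied = []
    for name in sorted(os.listdir(src_dir)):
        src, dst = os.path.join(src_dir, name), os.path.join(dst_dir, name)
        if not os.path.isfile(src):
            continue
        if not os.path.exists(dst) or digest(src) != digest(dst):
            shutil.copy2(src, dst)  # copy2 preserves timestamps/metadata
            copied.append(name)
    return copied

src, dst = tempfile.mkdtemp(), tempfile.mkdtemp()
with open(os.path.join(src, "a.txt"), "w") as f:
    f.write("hello")
first = backup_changed(src, dst)
second = backup_changed(src, dst)
print(first, second)  # ['a.txt'] [] -- the unchanged file is skipped on rerun
```

Skipping unchanged files is what makes incremental strategies cost-effective at cloud scale, where transfer and storage are billed.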
Procedia PDF Downloads 94
25868 Missing Link Data Estimation with Recurrent Neural Network: An Application Using Speed Data of Daegu Metropolitan Area
Authors: JaeHwan Yang, Da-Woon Jeong, Seung-Young Kho, Dong-Kyu Kim
Abstract:
In ITS, information on link characteristics is an essential factor for planning and operations. In practice, however, not every link has sensors installed on it. A link without data is called a “missing link”. The purpose of this study is to impute the data of these missing links. To do so, this study applies machine learning: with a deep learning process, missing link data can be estimated from the data of the present links. For the deep learning process, this study uses a recurrent neural network to handle the time-series data of the road. As input, Dedicated Short-Range Communications (DSRC) data from Dalgubul-daero in the Daegu Metropolitan Area were fed into the learning process. The network takes the 17 links with present data as input and uses 2 hidden layers to estimate the data of 1 missing link. As a result, the forecasted data for the target link show about 94% accuracy compared with the actual data.
Keywords: data estimation, link data, machine learning, road network
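The recurrent structure described above can be illustrated with the forward pass of a single-unit Elman-style RNN: a hidden state carries information across the time steps of a speed series. The weights and the speed sequence below are fixed, invented values purely for illustration; in the study, a multi-layer network with learned weights maps 17 observed links to 1 missing link.

```python
import math

def rnn_forward(sequence, w_in, w_rec, w_out):
    """Forward pass of a minimal single-unit Elman RNN. The hidden state h
    is updated at every time step, so the final output depends on the
    whole history of the input series, not just the last observation."""
    h = 0.0
    for x in sequence:
        h = math.tanh(w_in * x + w_rec * h)  # recurrent state update
    return w_out * h                          # readout: estimated value

speeds = [0.55, 0.60, 0.58, 0.62]  # normalised speeds of one observed link
estimate = rnn_forward(speeds, w_in=1.0, w_rec=0.5, w_out=1.0)
print(0.0 < estimate < 1.0)  # True: estimate stays in the normalised range
```

The tanh nonlinearity keeps the hidden state bounded, which is one reason simple RNNs are a natural fit for normalised time-series inputs such as link speeds.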
Procedia PDF Downloads 511
25867 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies
Authors: Monica Lia
Abstract:
This article presents a customer data analysis model that uses business intelligence tools for data modelling, transformation, data visualization, and dynamic report building. The analysis of an organization’s customers is based on the information in the transactional systems of the organization. The paper presents how to develop the data model starting from the data that companies hold inside their own operational systems. This data can be transformed into useful information about customers using a business intelligence tool. In a mature market, knowing the information inside the data and making forecasts for strategic decisions become more important. Business intelligence tools are used in business organizations as support for decision-making.
Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use case diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes
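The star schema named in the keywords can be sketched as a fact table of transactions referencing a dimension table of customers, rolled up into the kind of aggregate a BI dashboard serves. The tables below are invented toy data, not drawn from any telecommunication company's systems:

```python
# Toy star schema: one fact table (transactions) referencing a customer
# dimension table. All rows are hypothetical illustration data.
customers = {  # dimension table: customer_id -> attributes
    1: {"name": "Alice", "segment": "residential"},
    2: {"name": "Bob", "segment": "business"},
}
transactions = [  # fact table rows
    {"customer_id": 1, "amount": 20.0},
    {"customer_id": 2, "amount": 50.0},
    {"customer_id": 1, "amount": 10.0},
]

# Roll up revenue by customer segment: join each fact row to its
# dimension attributes, then aggregate -- the basic OLAP operation.
revenue_by_segment = {}
for row in transactions:
    segment = customers[row["customer_id"]]["segment"]
    revenue_by_segment[segment] = revenue_by_segment.get(segment, 0.0) + row["amount"]
print(revenue_by_segment)  # {'residential': 30.0, 'business': 50.0}
```

In a real deployment, this join-and-aggregate step would run inside the data warehouse (e.g. as SQL over a data mart) rather than in application code; the in-memory version here only shows the shape of the computation.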
Procedia PDF Downloads 437
25866 Fuels and Platform Chemicals Production from Lignocellulosic Biomass: Current Status and Future Prospects
Authors: Chandan Kundu, Sankar Bhattacharya
Abstract:
A significant disadvantage of fossil fuel energy production is the considerable amount of carbon dioxide (CO₂) released, one of the contributors to climate change. Apart from environmental concerns, changing fossil fuel prices have gradually pushed society towards renewable energy sources in recent years. Biomass is a plentiful and renewable resource and a source of carbon. Recent years have seen increased research interest in generating fuels and chemicals from biomass. Unlike fossil-based resources, biomass is composed of lignocellulosic material, which does not contribute to an increase in atmospheric CO₂ over the longer term. These considerations contribute to the current move of the chemical industry from non-renewable feedstocks to renewable biomass. This presentation focuses on generating bio-oil and two major platform chemicals that can potentially improve the environment. Thermochemical processes such as pyrolysis are considered viable methods for producing bio-oil and biomass-based platform chemicals. Fluidized bed reactors, in turn, are known to boost bio-oil yields during pyrolysis due to their superior mixing and heat transfer features, as well as their scalability. This review and the associated experimental work are focused on the thermochemical conversion of biomass to bio-oil and two high-value platform chemicals, levoglucosenone (LGO) and 5-chloromethylfurfural (5-CMF), in a fluidized bed reactor. These two active molecules with distinct features can potentially be useful monomers in the chemical and pharmaceutical industries, since they are well adapted to the manufacture of biologically active products. The process involved several meticulous steps. To begin, the biomass was delignified using a peracetic acid pretreatment. Because of its complicated structure, biomass must be pretreated to remove the lignin, increasing access to the carbohydrate components so that they can be converted to platform chemicals.
The biomass was then characterized in the laboratory by thermogravimetric analysis, synchrotron-based THz spectroscopy, and in-situ DRIFTS. Based on the results, a continuous-feeding fluidized bed reactor system was constructed to generate platform chemicals from pretreated biomass using hydrogen chloride acid-gas as a catalyst. The procedure also yields biochar, which has a number of potential applications, including soil remediation, wastewater treatment, electrode production, and use as an energy resource. Consequently, this research also includes a preliminary experimental evaluation of the biochar's prospective applications: the biochar obtained was evaluated for its CO₂ and steam reactivity. The outline of the presentation will comprise the following:
- Biomass pretreatment for effective delignification
- Mechanistic study of the thermal and thermochemical conversion of biomass
- Thermochemical conversion of untreated and pretreated biomass in the presence of an acid catalyst to produce LGO and 5-CMF
- A thermo-catalytic process for the production of LGO and 5-CMF in a continuously-fed fluidized bed reactor, and efficient separation of the chemicals
- Use of the biochar generated from platform chemicals production through gasification
Keywords: biomass, pretreatment, pyrolysis, levoglucosenone
Procedia PDF Downloads 144
25865 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions
Authors: K. Hardy, A. Maurushat
Abstract:
Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys and polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is no longer why people have done something for the last 10 years, but why they are doing it now, whether it is undesirable, and how we can have an impact to promote change immediately. Big data analytics rely heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.
Keywords: big data, open data, productivity, data governance
Procedia PDF Downloads 374
25864 Humins: From Industrial By-Product to High Value Polymers
Authors: Pierluigi Tosi, Ed de Jong, Gerard van Klink, Luc Vincent, Alice Mija
Abstract:
During the last decades, renewable and low-cost resources have attracted increasing interest. Carbohydrates can be derived from lignocellulosic biomass, an attractive option since it represents the most abundant carbon source available in nature. Carbohydrates can be converted into a plethora of industrially relevant compounds, such as 5-hydroxymethylfurfural (HMF) and levulinic acid (LA), through the acid-catalyzed dehydration of sugars with mineral acids. Unfortunately, these acid-catalyzed conversions suffer from the unavoidable formation of highly viscous, heterogeneous, poly-disperse, carbon-based materials known as humins. This black-colored, low-value by-product consists of a complex mixture of macromolecules built by random covalent condensations of the many compounds present during the acid-catalyzed conversion. The molecular structure of humins is still under investigation, but appears to be based on a network of furanic rings linked by aliphatic chains and decorated with several reactive moieties (ketones, aldehydes, hydroxyls, …). Despite decades of research, there is currently no way to avoid humins formation. The key to enhancing the economic viability of carbohydrate conversion processes is, therefore, to increase the economic value of the humins by-product. Herein, new humins-based polymeric materials are presented that can be prepared starting from the raw by-product by thermal treatment, without any purification or pretreatment step. Humins foams can be produced by controlling key reaction parameters, yielding polymeric porous materials with designed porosity, density, thermal and electrical conductivity, chemical and electrical stability, carbon content, and mechanical properties. Physico-chemical properties can be enhanced by modifying the starting raw material or by adding different species during the polymerization. A comparison of the properties of different compositions will be presented, along with tested applications.
The authors gratefully acknowledge the European Community for financial support through the Marie-Curie H2020-MSCA-ITN-2015 "HUGS" Project.
Keywords: by-product, humins, polymers, valorization
Procedia PDF Downloads 144
25863 A Review on Existing Challenges of Data Mining and Future Research Perspectives
Authors: Hema Bhardwaj, D. Srinivasa Rao
Abstract:
The technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets can be termed "big data". Big data mining and analysis techniques are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, improving sales, etc., because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining and its process, along with its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. This study presents the ideas behind data mining and analysis and recently developed knowledge discovery techniques, together with practical application systems. The conclusion also includes a list of issues and difficulties for further research in the area, and the paper discusses management's main big data and data mining challenges.
Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges
Procedia PDF Downloads 114
25862 Polypyrrole Integrated MnCo2O4 Nanorods Hybrid as Electrode Material for High Performance Supercapacitor
Authors: Santimoy Khilari, Debabrata Pradhan
Abstract:
Ever-increasing energy demand and a growing energy crisis, along with environmental issues, drive research on sustainable energy conversion and storage systems. Recently, supercapacitors, or electrochemical capacitors, have emerged as a promising energy storage technology for the future generation. The performance of a supercapacitor generally depends on the efficiency of its electrode materials, so the development of cost-effective, efficient electrode materials for supercapacitors is one of the challenges facing the scientific community. Transition metal oxides with a spinel crystal structure receive much attention for different electrochemical applications in energy storage/conversion devices because of their improved performance compared to simple oxides. In the present study, we have synthesized a polypyrrole (PPy)-supported manganese cobaltite nanorod (MnCo2O4 NR) hybrid electrode material for supercapacitor application. The MnCo2O4 NRs were synthesized by a simple hydrothermal and calcination approach. The MnCo2O4 NRs/PPy hybrid was prepared by in situ impregnation of MnCo2O4 NRs during the polymerization of pyrrole. The surface morphology and microstructure of the as-synthesized samples were characterized by scanning electron microscopy and transmission electron microscopy, respectively. The crystallographic phases of the MnCo2O4 NRs, PPy, and the hybrid were determined by X-ray diffraction. The electrochemical charge storage activity of the MnCo2O4 NRs, PPy, and MnCo2O4 NRs/PPy hybrid was evaluated by cyclic voltammetry, chronopotentiometry, and electrochemical impedance spectroscopy. A significant improvement in specific capacitance was achieved with the MnCo2O4 NRs/PPy hybrid compared to the individual components. Furthermore, mechanically mixed MnCo2O4 NRs and PPy showed a lower specific capacitance than the MnCo2O4 NRs/PPy hybrid, underlining the importance of in situ hybrid preparation.
The stability of the as-prepared electrode materials was tested by cyclic charge-discharge measurements over 1000 cycles. A maximum of 94% of the capacitance was retained with the MnCo2O4 NRs/PPy hybrid electrode. This study suggests that the MnCo2O4 NRs/PPy hybrid can be used as a low-cost electrode material for charge storage in supercapacitors.
Keywords: supercapacitors, nanorods, spinel, MnCo2O4, polypyrrole
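The specific capacitance evaluated from the chronopotentiometry (galvanostatic charge-discharge) measurements above is commonly computed as Cs = I·Δt / (m·ΔV). A minimal sketch, with illustrative current, time, mass, and voltage-window values rather than the paper's measurements:

```python
# Specific capacitance from a galvanostatic discharge curve:
#   Cs = I * dt / (m * dV)   [F/g]
# All numbers below are hypothetical illustration values.
def specific_capacitance(current_a, discharge_time_s, mass_g, voltage_window_v):
    return current_a * discharge_time_s / (mass_g * voltage_window_v)

cs = specific_capacitance(current_a=0.002,       # 2 mA discharge current
                          discharge_time_s=300,  # 300 s to traverse the window
                          mass_g=0.002,          # 2 mg active material
                          voltage_window_v=0.8)  # 0.8 V potential window
print(round(cs))  # 375 F/g
```

Capacitance retention over cycling, as reported above, is then the ratio of Cs after N cycles to the initial Cs.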
Procedia PDF Downloads 343
25861 A Systematic Review on Challenges in Big Data Environment
Authors: Rimmy Yadav, Anmol Preet Kaur
Abstract:
Big Data has demonstrated vast potential for streamlining operations, supporting decision-making, and spotting business trends in different fields, for example, manufacturing, finance, and Information Technology. This paper gives a multi-disciplinary overview of the research issues in big data and of its procedures, instruments, and systems related to privacy, data storage management, network and energy utilization, fault tolerance, and data representation. Beyond this, the challenges and opportunities available in the Big Data platform are outlined.
Keywords: big data, privacy, data management, network and energy consumption
Procedia PDF Downloads 314