Search results for: applications of big data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29410

27580 A User-Friendly Approach for Design and Economic Analysis of Standalone PV System for the Electrification of Rural Area of Eritrea

Authors: Tedros Asefaw Gebremeskel, Xaoyi Yang

Abstract:

The potential of solar energy in Eritrea is relatively high, yet a number of isolated and remote villages situated far from the national electrical grid still lack access to electricity. The core objective of this work is to design an optimal and cost-effective standalone PV system for the electrification of a single household in an inaccessible area of Eritrea. The sizing of the recommended PV system is carried out; radiation data and the electrical load of a typical household at the selected site are also considered in the design steps. Finally, a life cycle cost (LCC) analysis is conducted to evaluate the economic viability of the system. The outcomes of the study promote the use of PV systems for residential buildings and show that a PV system is a reasonable option for supplying electricity to household applications in the rural areas of Eritrea.
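The life cycle cost analysis described above can be sketched numerically. The sketch below is a generic LCC calculation for a standalone PV system; all cost figures, the discount rate, and the battery replacement schedule are hypothetical placeholders, not values from the study.

```python
# Hedged sketch of a life cycle cost (LCC) calculation: initial capital plus
# the present worth of recurring O&M and periodic battery replacement.
# All cost figures are illustrative, not the paper's data.

def present_worth(amount, rate, year):
    """Discount a future cash flow to its present value."""
    return amount / (1.0 + rate) ** year

def lcc(capital, annual_om, replacement_cost, replacement_years, rate, lifetime):
    om = sum(present_worth(annual_om, rate, y) for y in range(1, lifetime + 1))
    repl = sum(present_worth(replacement_cost, rate, y) for y in replacement_years)
    return capital + om + repl

# Hypothetical system: $2000 capital, $50/yr O&M, battery replaced at
# years 8 and 16, 8% discount rate, 20-year lifetime.
total = lcc(2000.0, 50.0, 400.0, (8, 16), 0.08, 20)
print(round(total, 2))
```

The same structure extends directly to salvage value or component-wise costing if the study's actual cost breakdown is available.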

Keywords: electrification, inaccessible area, life cycle cost, residential building, stand-alone PV system

Procedia PDF Downloads 133
27579 A Case Study on Theme-Based Approach in Health Technology Engineering Education: Customer Oriented Software Applications

Authors: Mikael Soini, Kari Björn

Abstract:

Metropolia University of Applied Sciences (MUAS) Information and Communication Technology (ICT) Degree Programme provides full-time Bachelor-level undergraduate studies. The ICT Degree Programme has seven different major options; this paper focuses on Health Technology. In Health Technology, a significant curriculum change in 2014 enabled a transition from a fragmented curriculum of dozens of courses to a new integrated curriculum built around three 30 ECTS themes. This paper focuses especially on the second theme, Customer Oriented Software Applications. From the students’ point of view, the goals of this theme are to become familiar with existing health-related ICT solutions and systems, understand the business around health technology, recognize social and healthcare operating principles and services, and identify customers and users and their special needs and perspectives. This also serves as background for health-related web application development. The web application built is tested, developed and evaluated with real users utilizing versatile user-centred development methods. This paper presents experiences from the first implementation of the Customer Oriented Software Applications theme. Student feedback was gathered with two questionnaires, one in the middle of the theme and the other at its end. The questionnaires had qualitative and quantitative parts. A similar questionnaire was implemented in the first theme; this paper evaluates how the theme-based integrated curriculum has progressed in the Health Technology major by comparing results between themes 1 and 2. In general, students were satisfied with the implementation, the timing and synchronization of the courses, and the amount of work. However, there is still room for development. Student feedback and teachers’ observations have been and will be used to develop the content and operating principles of the themes and the whole curriculum.

Keywords: engineering education, integrated curriculum, learning and teaching methods, learning experience

Procedia PDF Downloads 314
27578 The Journey of a Malicious HTTP Request

Authors: M. Mansouri, P. Jaklitsch, E. Teiniker

Abstract:

SQL injection is a very popular kind of attack on web applications. Mechanisms such as intrusion detection systems exist to detect this attack, but these strategies often rely on techniques implemented at high layers of the application and do not consider the low level of system calls. The problem with considering only the high-level perspective is that an attacker can circumvent the detection tools using techniques such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the Operating System (OS) kernel; they allow catching the critical data at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program or web application, “speaks” the language of system calls when having a conversation with the OS kernel. At this level we can see the actual attack while it is happening. We conducted an experiment to demonstrate the suitability of system call analysis for detecting SQL injection and were able to detect the attack. We therefore conclude that system calls are not only powerful in detecting low-level attacks but also enable us to detect high-level attacks such as SQL injection.
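The evasion point made above can be illustrated with a small sketch. This is not the authors' implementation (which traces real system calls); it only shows the idea of inspecting payloads at the write/send boundary, where URL decoding first defeats the simple encoding trick that fools a purely high-level filter. The injection signatures are illustrative, not exhaustive.

```python
# Illustrative sketch: scan the payloads of write/send system calls (e.g. as
# captured by a tracer such as strace) for SQL injection signatures. Decoding
# the payload before matching defeats simple URL-encoding evasion, which is
# the motivation for working below the application layer.
import re
from urllib.parse import unquote_plus

SQLI_PATTERN = re.compile(
    r"('|%27).{0,40}?(or|and)\s+\d+\s*=\s*\d+|union\s+select|;\s*drop\s+table",
    re.IGNORECASE,
)

def is_suspicious(syscall_payload: str) -> bool:
    decoded = unquote_plus(syscall_payload)  # undo URL encoding first
    return bool(SQLI_PATTERN.search(decoded))

# A URL-encoded injection attempt that a naive high-level filter might miss:
print(is_suspicious("id=1%27+OR+1%3D1--"))   # decodes to the classic ' OR 1=1
print(is_suspicious("id=42&name=alice"))     # benign request
```

A production detector would of course operate on the actual traced syscall stream rather than isolated strings, but the decode-then-match step is the essence of the argument.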

Keywords: Linux system calls, web attack detection, interception, SQL

Procedia PDF Downloads 351
27577 Using a Blockchain-Based, End-to-End Encrypted Communication System Between Mobile Terminals to Improve Organizational Privacy

Authors: Andrei Bogdan Stanescu, Robert Stana

Abstract:

Creating private and secure communication channels between employees has become critical to ensuring organizational integrity and avoiding leaks of sensitive information. With the widespread use of modern methods of intercepting and disrupting communication between users, real use-cases have emerged for advanced encryption mechanisms that thwart cyber-attackers seeking to intercept private conversations between critical employees in an organization. This paper presents a custom implementation of a messaging application named “Whisper” that uses end-to-end encryption (E2EE) mechanisms and blockchain-related components to protect sensitive conversations and mitigate the risks of information breaches inside organizations. The results of this research aim to expand the areas of applicability of E2EE algorithms and their integration with private blockchains in chat applications as a viable method of enhancing intra-organizational communication privacy.
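The abstract does not describe Whisper's internal design, but the tamper-evidence property that a private blockchain layer can add to a chat history can be sketched with a toy hash chain. This is purely illustrative: real end-to-end encryption of the message bodies (authenticated key exchange, per-message ciphers) is a separate layer that the sketch deliberately omits.

```python
# Toy illustration (not the Whisper implementation): a hash-chained message
# log gives the tamper-evidence that a private blockchain can add to a chat
# history. Each block commits to the previous block's hash, so rewriting any
# past message invalidates the rest of the chain.
import hashlib

GENESIS = b"\x00" * 32

def append_block(chain, message: bytes):
    prev = chain[-1]["hash"] if chain else GENESIS
    h = hashlib.sha256(prev + message).digest()
    chain.append({"prev": prev, "msg": message, "hash": h})

def verify(chain) -> bool:
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev:
            return False
        if hashlib.sha256(prev + block["msg"]).digest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

log = []
append_block(log, b"hello")
append_block(log, b"meet at 10")
print(verify(log))          # chain is consistent
log[0]["msg"] = b"HELLO"    # tamper with history
print(verify(log))          # tampering is detected
```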

Keywords: end-to-end encryption, mobile communication, cryptography, communication security, data privacy

Procedia PDF Downloads 82
27576 Synthesis and Characterization of Partially Oxidized Graphite Oxide for Solar Energy Storage Applications

Authors: Ghada Ben Hamad, Zohir Younsi, Fabien Salaun, Hassane Naji, Noureddine Lebaz

Abstract:

The graphene oxide (GO) material has attracted much attention for solar energy applications. This paper reports the synthesis and characterization of partially oxidized graphite oxide (GTO). GTO was obtained by a modified Hummers method, which is based on the chemical oxidation of natural graphite. Several samples with different oxidation degrees were prepared by adjusting the amount of oxidizing agent. The effect of the oxidation degree on the chemical structure and morphology of GTO was determined using Fourier transform infrared (FT-IR) spectroscopy, energy-dispersive X-ray spectroscopy (EDS), and scanning electron microscopy (SEM). The thermal stability of GTO was evaluated using a thermogravimetric analyzer (TGA) under a nitrogen atmosphere. The results indicate a high oxidation degree of the graphite oxide for each sample, proving that the process is efficient. The GTO synthesized by the modified Hummers method shows promising characteristics. Graphene oxide (GO) obtained by exfoliation of GTO is recognized as a good candidate for thermal energy storage, and it will be used as a solid shell material in the encapsulation of phase change materials (PCM).

Keywords: modified Hummers method, graphite oxide, oxidation degree, solar energy storage

Procedia PDF Downloads 114
27575 A Study on How to Link BIM Services to Cloud Computing Architecture

Authors: Kim Young-Jin, Kim Byung-Kon

Abstract:

Although more efforts to expand the application of BIM (Building Information Modeling) technologies have been pursued in recent years than ever, various challenges remain, including a lack or absence of relevant institutions, the high costs required to build BIM-related infrastructure, incompatible processes, etc. This, in turn, has delayed the expansion of their application longer than expected at an early stage. In particular, attempts to save the costs of building BIM-related infrastructure and to provide various BIM services compatible with domestic processes include studies linking BIM and cloud computing technologies. In this study, the authors develop a cloud BIM service operation model by analyzing the level of BIM application in the construction sector and deriving relevant service areas, and examine how to link BIM services to the cloud operation model, through archiving BIM data and creating a revenue structure so that the BIM services may grow spontaneously, considering the demand for cloud resources.

Keywords: construction IT, BIM (building information modeling), cloud computing, BIM service based cloud computing

Procedia PDF Downloads 483
27574 Live Music Promotion in Burundi Country

Authors: Aster Anderson Rugamba

Abstract:

Context: Live music in Burundi is currently facing neglect and a decline in popularity, resulting in artists struggling to generate income from this field. Additionally, live music from Burundi has not been able to gain traction in the international market. It is essential to establish various structures and organizations to promote cultural events and support artistic endeavors in music and performing arts. Research Aim: The aim of this research is to seek new knowledge and understanding in the field of live music and its content in Burundi. Furthermore, it aims to connect with other professionals in the industry, make new discoveries, and explore potential collaborations and investments. Methodology: The research will utilize both quantitative and qualitative research methodologies. The quantitative approach will involve a sample size of 57 musician artists in Burundi. It will employ closed-ended questions and gather quantitative data to ensure a large sample size and high external validity. The qualitative approach will provide deeper insights and understanding through open-ended questions and in-depth interviews with selected participants. Findings: The research expects to find new theories, methodologies, empirical findings, and applications of existing knowledge that can contribute to the development of live music in Burundi. By exploring the challenges faced by artists and identifying potential solutions, the study aims to establish live music as a catalyst for development and generate a positive impact on both the Burundian and international community. Theoretical Importance: Theoretical contributions of this research will expand the current understanding of the live music industry in Burundi. It will propose new theories and models to address the issues faced by artists and highlight the potential of live music as a lucrative and influential industry. 
By bridging the gap between theory and practice, the research aims to provide valuable insights for academics, professionals, and policymakers. Data Collection and Analysis Procedures: Data will be collected through surveys, interviews, and archival research. Surveys will be administered to the sample of 57 musician artists, while interviews will be conducted to gain in-depth insights from selected participants. The collected data will be analyzed using both quantitative and qualitative methods, including statistical analysis and thematic analysis, respectively. This mixed-method approach will ensure a comprehensive and rigorous examination of the research questions addressed.

Keywords: business music in Burundi, music in Burundi, promotion of art, Burundi music culture

Procedia PDF Downloads 56
27573 An Evaluation of the Impact of Epoxidized Neem Seed Azadirachta indica Oil on the Mechanical Properties of Polystyrene

Authors: Salihu Takuma

Abstract:

Neem seed oil has a high content of unsaturated fatty acids, which can be converted to epoxy fatty acids. Vegetable oil-based epoxy materials are sustainable, renewable and biodegradable materials that are replacing petrochemical-based epoxy materials in some applications. Polystyrene is highly brittle, which limits its mechanical applications. Raw neem seed oil was obtained from the National Research Institute for Chemical Technology (NARICT), Zaria, Nigeria. The oil was epoxidized at 60 °C for three (3) hours using formic acid generated in situ. The epoxidized oil was characterized using Fourier transform infrared (FTIR) spectroscopy. The disappearance of the C=C stretching peak around 3011.7 cm⁻¹ and the formation of a new absorption peak around 943 cm⁻¹ indicate the success of the epoxidation. The epoxidized oil was blended with pure polystyrene in different weight percent compositions using solution casting in chloroform. The tensile properties of the blends demonstrated that the addition of 5 wt% ENO to PS led to an increase in elongation at break but a decrease in tensile strength and modulus. This is in accordance with the common rule that plasticizers decrease the tensile strength of a polymer.

Keywords: biodegradable, elongation at break, epoxidation, epoxy fatty acids, sustainable, tensile strength and modulus

Procedia PDF Downloads 226
27572 Weed Out the Bad Seeds: The Impact of Strategic Portfolio Management on Patent Quality

Authors: A. Lefebre, M. Willekens, K. Debackere

Abstract:

Since the 1990s, patent applications have been booming, especially in the field of telecommunications. However, this increase in patent filings has been associated with an (alleged) decrease in patent quality. The plethora of low-quality patents devalues the high-quality ones, thus weakening the incentives for inventors to patent inventions. Despite the rich literature on strategic patenting, previous research has neglected to emphasize the importance of patent portfolio management and its impact on patent quality. In this paper, we compare related patent portfolios vs. nonrelated patents and investigate whether the patent quality and innovativeness differ between the two types. In the analyses, patent quality is proxied by five individual proxies (number of inventors, claims, renewal years, designated states, and grant lag), and these proxies are then aggregated into a quality index. Innovativeness is proxied by two measures: the originality and radicalness index. Results suggest that related patent portfolios have, on average, a lower patent quality compared to nonrelated patents, thus suggesting that firms use them for strategic purposes rather than for the extended protection they could offer. Even upon testing the individual proxies as a dependent variable, we find evidence that related patent portfolios are of lower quality compared to nonrelated patents, although not all results show significant coefficients. Furthermore, these proxies provide evidence of the importance of adding fixed effects to the model. Since prior research has found that these proxies are inherently flawed and never fully capture the concept of patent quality, we have chosen to run the analyses with individual proxies as supplementary analyses; however, we stick with the comprehensive index as our main model. This ensures that the results are not dependent upon one certain proxy but allows for multiple views of the concept. 
The presence of divisional applications might be linked to the level of innovativeness of the underlying invention. It could be the case that the parent application is so important that firms are going through the administrative burden of filing for divisional applications to ensure the protection of the invention and the preemption of competition. However, it could also be the case that the preempting is a result of divisional applications being used strategically as a backup plan and prolonging strategy, thus negatively impacting the innovation in the portfolio. Upon testing the level of novelty and innovation in the related patent portfolios by means of the originality and radicalness index, we find evidence for a significant negative association with related patent portfolios. The minimum innovation that has been brought on by the patents in the related patent portfolio is lower compared to the minimum innovation that can be found in nonrelated portfolios, providing evidence for the second argument.
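The aggregation of the five quality proxies into an index, described above, can be sketched as follows. The paper does not give its exact aggregation formula, so this sketch uses a common convention: z-score standardization of each proxy and an equal-weight average, with the sign of grant lag flipped on the assumption that a longer grant lag signals lower quality. Both choices are assumptions, not the paper's specification.

```python
# Sketch of aggregating patent-quality proxies (inventors, claims, renewal
# years, designated states, grant lag) into one index via z-scores and an
# equal-weight mean. The sign flip on grant lag and the equal weights are
# assumptions, not the paper's exact method.
from statistics import mean, pstdev

def zscores(values):
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

def quality_index(patents):
    """patents: list of dicts holding the five proxy values per patent."""
    cols = ["inventors", "claims", "renewal_years", "designated_states", "grant_lag"]
    standardized = {c: zscores([p[c] for p in patents]) for c in cols}
    index = []
    for i in range(len(patents)):
        parts = [standardized[c][i] for c in cols[:-1]]
        parts.append(-standardized["grant_lag"][i])  # longer lag -> lower quality
        index.append(mean(parts))
    return index

sample = [
    {"inventors": 3, "claims": 20, "renewal_years": 12, "designated_states": 8, "grant_lag": 2},
    {"inventors": 1, "claims": 5,  "renewal_years": 4,  "designated_states": 2, "grant_lag": 6},
]
idx = quality_index(sample)
print(idx[0] > idx[1])   # first patent scores higher on every proxy
```

Standardizing first keeps any one proxy's scale from dominating the index, which is the usual reason for this construction.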

Keywords: patent portfolio management, patent quality, related patent portfolios, strategic patenting

Procedia PDF Downloads 91
27571 Statistical Analysis for Overdispersed Medical Count Data

Authors: Y. N. Phang, E. F. Loh

Abstract:

Many researchers have suggested the use of zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models for modeling over-dispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. Studies indicate that ZIP and ZINB consistently provide a better fit than the ordinary Poisson and negative binomial models for such data. In this study, we propose the use of the zero-inflated inverse trinomial (ZIIT), zero-inflated Poisson inverse Gaussian (ZIPIG) and zero-inflated strict arcsine (ZISA) models for modeling over-dispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that these three models can serve as alternatives for modeling over-dispersed medical count data, as supported by their application to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian, and strict arcsine distributions are discrete distributions with a cubic variance function of the mean; therefore, ZIIT, ZIPIG and ZISA can accommodate data with excess zeros and very heavy tails. They are recommended for modeling over-dispersed medical count data when ZIP and ZINB are inadequate.
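The zero-inflation mechanism common to all of these models can be illustrated with the simplest member of the family, the ZIP distribution (the baseline the paper compares against; the proposed ZIIT, ZIPIG and ZISA distributions are not reproduced here). A ZIP mixes a point mass at zero, with probability π, into a Poisson(λ) component.

```python
# Minimal illustration of zero inflation: the ZIP pmf is
#   P(0) = pi + (1 - pi) * e^(-lam),  P(k) = (1 - pi) * Poisson(k; lam) for k > 0.
# This inflates the zero probability relative to a plain Poisson with the
# same lambda, which is exactly the excess-zeros feature the abstract cites.
from math import exp, factorial

def zip_pmf(k: int, pi: float, lam: float) -> float:
    poisson = exp(-lam) * lam ** k / factorial(k)
    if k == 0:
        return pi + (1 - pi) * poisson
    return (1 - pi) * poisson

pi, lam = 0.3, 2.0
# More mass at zero than a plain Poisson with the same lambda:
print(zip_pmf(0, pi, lam) > zip_pmf(0, 0.0, lam))
# The pmf still sums to ~1 over its support:
total = sum(zip_pmf(k, pi, lam) for k in range(50))
print(abs(total - 1.0) < 1e-9)
```

The proposed distributions replace the Poisson component with heavier-tailed alternatives (cubic variance function of the mean), but the mixture-with-zero structure is the same.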

Keywords: zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit

Procedia PDF Downloads 532
27570 Use and Appreciation of a Type of Mathematics Textbook for Secondary Education

Authors: Verónica Díaz Quezada

Abstract:

Despite the wide variety of educational resources on the market and advances in the technological field, the practice of teaching continues to be supported mainly by textbooks. This article reports descriptive research with a qualitative methodology, carried out with secondary school mathematics teachers in a region of Chile, in order to describe how teachers use the textbooks distributed by the official body to public educational establishments and what indicators of appreciation they hold about them. Data were collected through an open-response opinion questionnaire. According to the results, among the texts available for their annual teaching work, expository and technological books predominate, to the detriment of comprehensive books. The expository structure favors lecture-style exposition and repetitive exercises, while the technological structure attempts productive exercise, proposing numerous applications with the intention of giving meaning to the different mathematical rules and procedures. Regarding the indicators of appreciation that teachers hold about the use of mathematics textbooks, the suitability and quality of the teaching resources emerge as the most satisfying characteristics.

Keywords: mathematics, secondary school, teachers, textbooks

Procedia PDF Downloads 157
27569 Monotone Rational Trigonometric Interpolation

Authors: Uzma Bashir, Jamaludin Md. Ali

Abstract:

This study is concerned with the visualization of monotone data using a piecewise C1 rational trigonometric interpolating scheme. Four positive shape parameters are incorporated in the structure of the rational trigonometric spline. Conditions on two of these parameters are derived to preserve the monotonicity of monotone data, while the other two are left free. Figures are used extensively to show that the proposed scheme produces graphically smooth monotone curves.

Keywords: trigonometric splines, monotone data, shape preserving, C1 monotone interpolant

Procedia PDF Downloads 263
27568 Teachers’ Incorporation of Emerging Communication Technologies in Higher Education in Kuwait

Authors: Bashaiar Alsanaa

Abstract:

Never has a revolution influenced all aspects of humanity as the communication revolution during the past two decades. This revolution, with all its advances and utilities, swept the world thus becoming an integral part of our lives, hence giving way to emerging applications at the social, economic, political, and educational levels. More specifically, such applications have changed the delivery system through which learning is acquired by students. Interaction with educators, accessibility to content, and creative delivery options are but a few facets of the new learning experience now being offered through the use of technology in the educational field. With different success rates, third world countries have tried to pace themselves with use of educational technology in advanced parts of the world. One such country is the small rich-oil state of Kuwait which has tried to adopt the e-educational model, however, an evaluation of such trial is yet to be done. This study aims to fill the void of research conducted around that topic. The study explores teachers’ acceptance of incorporating communication technologies in higher education in Kuwait. Teachers’ responses to survey questions present an overview of the e-learning experience in this country, and draw a framework through which implications and suggestions for future research can be discussed to better serve the advancement of e-education in developing countries.

Keywords: communication technologies, E-learning, Kuwait, social media

Procedia PDF Downloads 278
27566 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels

Authors: Joshua Buli, David Pietrowski, Samuel Britton

Abstract:

Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, to better view SAR data in an image domain comparable to what a human would view and to ease interpretation. An alternate but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.); the data are then range compressed; and lastly, the contribution from each pulse is determined for each specific point in space by searching the time-history data for the reflectivity values of each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing now allow this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase-history data size and 3D point cloud size. Backprojection algorithms are embarrassingly parallel since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for an accurate reflectivity representation of a scene.
Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow for SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
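The per-voxel summation described above can be sketched in a few lines. This toy omits phase handling, interpolation between range bins, and real SAR geometry; it only shows the structure that makes backprojection embarrassingly parallel: each voxel's sum reads the pulse data independently of every other voxel.

```python
# Toy backprojection sketch: for each 3D reference voxel, look up each
# range-compressed pulse profile at the range from that pulse's antenna
# position to the voxel, and accumulate. Each voxel is independent, which
# is why the real algorithm maps one voxel per GPU thread.
import math

def backproject(voxels, pulses, range_bin_m):
    """pulses: list of (antenna_xyz, range_profile); returns one value per voxel."""
    image = []
    for vx in voxels:                      # independent per voxel
        acc = 0.0
        for antenna, profile in pulses:
            r = math.dist(antenna, vx)     # range from antenna to voxel
            bin_idx = int(round(r / range_bin_m))
            if 0 <= bin_idx < len(profile):
                acc += profile[bin_idx]
        image.append(acc)
    return image

# One scatterer at the origin; two pulses whose range profiles peak exactly
# at the correct bin for that scatterer and are zero elsewhere.
bin_m = 1.0
pulses = []
for antenna in [(100.0, 0.0, 0.0), (0.0, 100.0, 0.0)]:
    profile = [0.0] * 200
    profile[int(round(math.dist(antenna, (0.0, 0.0, 0.0)) / bin_m))] = 1.0
    pulses.append((antenna, profile))

img = backproject([(0.0, 0.0, 0.0), (10.0, 10.0, 0.0)], pulses, bin_m)
print(img[0] > img[1])   # the true scatterer voxel accumulates the most energy
```

Because the voxel list can be any point cloud, the same loop works unchanged over a digital elevation model, which is the flexibility the abstract emphasizes.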

Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization

Procedia PDF Downloads 69
27565 A Detailed Study of Two Different Airfoils on Flight Performance of MAV of Same Physical Dimension

Authors: Shoeb A. Adeel, Shashant Anand, Vivek Paul, Dinesh, Suraj, Roshan

Abstract:

The paper presents a study of micro air vehicles (MAVs) with wingspans of 20 cm in two different airfoil configurations. MAVs have vast potential applications in both military and civilian areas. These MAVs are fully autonomous and supply real-time data. The paper focuses on two different designs of the MAV, one using an N22 airfoil and the other a flat plate of similar dimensions. As designed, the MAV would fly in a low Reynolds-number regime at airspeeds of 15 and 20 m/s. Propulsion would be provided by an electric motor with an advanced lithium battery. Because of the close coupling between vehicle elements, system integration would be a significant challenge, requiring tight packaging and multifunction components to meet mass limitations and centre of gravity (C.G.) balancing. These MAVs are feasible, and within a couple of years of technology development in key areas, including sensors, propulsion, aerodynamics, and packaging, they would be available to users at affordable prices. The paper finally compares the flight performance of the two configurations.

Keywords: airfoil, CFD, MAV, flight performance, endurance, climb, lift, drag

Procedia PDF Downloads 486
27564 Key Performance Indicators and the Model for Achieving Digital Inclusion for Smart Cities

Authors: Khalid Obaed Mahmod, Mesut Cevik

Abstract:

The term smart city has appeared recently, accompanied by many definitions and concepts. As a simplified and clear definition, a smart city is a geographical location that has gained efficiency and flexibility in providing public services to citizens through its use of information and communication technologies, and this is what distinguishes it from other cities. Smart cities connect the various components of the city through main and sub-networks, in addition to a set of applications, and are thus able to collect the data that is the basis for providing technological solutions to manage resources and provide services. The work of the smart city is based on the use of artificial intelligence and Internet of Things technology. This work presents the concept of smart cities; the pillars, standards, and evaluation indicators on which smart cities depend; and the reasons that prompted the world to move towards their establishment. It also provides a simplified hypothetical way to measure an ideal smart city model by defining some indicators and key pillars, simulating them with logic circuits, and testing them to determine whether the city can be considered an ideal smart city or not.
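The logic-circuit simulation described above can be sketched as follows: each pillar indicator is thresholded into a boolean signal, and the "ideal smart city" output is the AND of all pillar signals. The pillar names and the threshold below are hypothetical stand-ins, not the paper's exact indicator set.

```python
# Sketch of the logic-gate idea: threshold each pillar score into a bit
# (1 = indicator met), then AND all bits to decide whether the city counts
# as an ideal smart city. Pillar names and the 0.7 threshold are assumptions.
PILLARS = ["governance", "mobility", "environment", "economy", "living", "people"]

def to_bits(scores, threshold=0.7):
    return {p: scores[p] >= threshold for p in PILLARS}

def is_ideal_smart_city(scores) -> bool:
    bits = to_bits(scores)
    result = True
    for p in PILLARS:          # AND gate over all pillar signals
        result = result and bits[p]
    return result

city_a = {p: 0.9 for p in PILLARS}        # every pillar passes its threshold
city_b = dict(city_a, mobility=0.4)       # one failing pillar
print(is_ideal_smart_city(city_a))        # AND of all-true inputs
print(is_ideal_smart_city(city_b))        # a single 0 input forces the output low
```

Swapping the AND for a weighted vote or a majority gate would relax the "ideal" criterion, which is one way such a model could be tuned.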

Keywords: factors, indicators, logic gates, pillars, smart city

Procedia PDF Downloads 142
27563 Integration of Knowledge and Metadata for Complex Data Warehouses and Big Data

Authors: Jean Christian Ralaivao, Fabrice Razafindraibe, Hasina Rakotonirainy

Abstract:

This document constitutes a resumption of work carried out in the field of complex data warehouses (DW) relating to the management and formalization of knowledge and metadata. It offers a methodological approach for integrating two concepts, knowledge and metadata, within the framework of a complex DW architecture. The work considers the use of knowledge representation by description logics and the extension of the Common Warehouse Metamodel (CWM) specifications, which is expected to yield performance benefits for a complex DW. Three essential aspects of this work are expected, including the representation of knowledge in description logics and the declination of this knowledge into consistent UML diagrams while respecting or extending the CWM specifications and using XML as a pivot. The field of application is large but will be adapted to systems with heterogeneous, complex and unstructured content that moreover require a great (re)use of knowledge, such as medical data warehouses.

Keywords: data warehouse, description logics, integration, knowledge, metadata

Procedia PDF Downloads 131
27562 Data Analytics in Energy Management

Authors: Sanjivrao Katakam, Thanumoorthi I., Antony Gerald, Ratan Kulkarni, Shaju Nair

Abstract:

With increasing energy costs and their impact on business, sustainability has evolved from a social expectation to an economic imperative, and finding methods to reduce cost has become a critical directive for industry leaders. Effective energy management is the only way to cut these costs. However, energy management has been a challenge because it requires a change in old habits and legacy systems followed for decades. Today, vast volumes of energy and operational data are being captured and stored by industries, but they are unable to convert these structured and unstructured data sets into meaningful business intelligence. For quick decisions, organizations must learn to cope with large volumes of operational data in different formats. Energy analytics not only helps in extracting inferences from these data sets but is also instrumental in the transformation from old approaches to energy management to new ones. This, in turn, assists in effective decision-making for implementation. Organizations require an established corporate strategy for reducing operational costs through visibility and optimization of energy usage, and energy analytics plays a key role in the optimization of operations. The paper describes how energy data analytics is extensively used today in scenarios such as reducing operational costs, predicting energy demand, optimizing network efficiency, asset maintenance, improving customer insights, and device data insights. The paper also highlights how analytics helps transform insights obtained from energy data into sustainable solutions. The paper utilizes data from an array of segments such as the retail, transportation, and water sectors.

Keywords: energy analytics, energy management, operational data, business intelligence, optimization

Procedia PDF Downloads 359
27561 Detecting Heartbeat Architectural Tactic in Source Code Using Program Analysis

Authors: Ananta Kumar Das, Sujit Kumar Chakrabarti

Abstract:

Architectural tactics such as heartbeat, ping-echo, encapsulate, and encrypt data are techniques used to achieve the quality attributes of a system. Detecting architectural tactics has several benefits: it can aid system comprehension (e.g., of legacy systems) and the estimation of quality attributes such as safety, security, and maintainability. Architectural tactics are typically spread over the source code and are implicit, and for large codebases manual detection is often not feasible; there is therefore a need for automated methods of detecting architectural tactics. This paper presents a formalization of the heartbeat architectural tactic and a program-analytic approach to detecting this tactic in source code. The proposed method is evaluated on a set of Java applications. The outcome of the experiment strongly suggests that the method compares well with a manual approach in terms of sensitivity and specificity, and far surpasses a manual exercise in terms of scalability.
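To illustrate the core idea of tactic detection, here is a minimal sketch, not the paper's method: the study targets Java with AST and alias analysis, while this toy Python analogue merely flags a function as a heartbeat candidate if it contains an unbounded loop that calls a `sleep` routine (the periodic-emission signature of the tactic). All names in the snippet are invented.

```python
# Toy illustration only: the paper formalizes the heartbeat tactic for Java
# using AST and alias analysis. This sketch scans Python source with the
# stdlib `ast` module and flags functions containing a `while True` loop
# that calls some `.sleep(...)` method -- a crude heartbeat signature.
import ast

def find_heartbeat_candidates(source):
    candidates = []
    tree = ast.parse(source)
    for fn in (n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)):
        for loop in (n for n in ast.walk(fn) if isinstance(n, ast.While)):
            # Unbounded loop: `while True: ...`
            infinite = isinstance(loop.test, ast.Constant) and loop.test.value is True
            # Periodic pause: any attribute call named `sleep` inside the loop
            sleeps = any(
                isinstance(c, ast.Call)
                and isinstance(c.func, ast.Attribute)
                and c.func.attr == "sleep"
                for c in ast.walk(loop)
            )
            if infinite and sleeps:
                candidates.append(fn.name)
    return candidates

code = '''
import time
def monitor(channel):
    while True:
        channel.send("alive")   # periodic liveness message
        time.sleep(5)
def compute(x):
    return x * x
'''
print(find_heartbeat_candidates(code))  # ['monitor']
```

A real detector, as the abstract notes, must also resolve aliases and trace the message across module boundaries; this sketch shows only the structural pattern-matching step.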

Keywords: software architecture, architectural tactics, detecting architectural tactics, program analysis, AST, alias analysis

Procedia PDF Downloads 149
27560 An Enzyme Technology - Metnin™ - Enables the Full Replacement of Fossil-Based Polymers by Lignin in Polymeric Composites

Authors: Joana Antunes, Thomas Levée, Barbara Radovani, Anu Suonpää, Paulina Saloranta, Liji Sobhana, Petri Ihalainen

Abstract:

Lignin is an important component in the exploitation of lignocellulosic biomass. It has been shown that within the next few years added-value lignin-based chemicals and materials will provide renewable alternatives to oil-based products (e.g. polymeric composites, resins and adhesives) and enhance the economic feasibility of biorefineries. In this paper, a novel technology for lignin valorisation (METNIN™) is presented. METNIN™ is based on the oxidative action of an alkaliphilic enzyme in aqueous alkaline conditions (pH 10-11) at mild temperature (40-50 °C), combined with a cascading membrane operation, yielding a collection of lignin fractions (from oligomeric down to a mixture of tri-, di- and monomeric units) with distinct molecular weight distributions, low polydispersity and favourable physicochemical properties. The alkaline process conditions ensure the high processibility of crude lignin in an aqueous environment and the efficiency of the enzyme, yielding better compatibility of lignin towards targeted applications. The application of a selected lignin fraction produced by METNIN™ as a lignopolyol to completely replace a commercial polyol in polyurethane rigid foam formulations is presented as a prototype. Liquid lignopolyols with a high lignin content were prepared by oxypropylation, and their full utilization in the polyurethane rigid foam formulation was successfully demonstrated. Moreover, selected technical specifications of different foam demonstrators were determined, including closed cell count, water uptake and compression characteristics; these specifications are within industrial standards for rigid foam applications. The lignin loading in the lignopolyol was a major factor determining the properties of the foam. In addition to the polyurethane foam demonstrators, other examples of lignin-based products related to resins and sizing applications will be presented.

Keywords: enzyme, lignin valorisation, polyol, polyurethane foam

Procedia PDF Downloads 146
27559 Harnessing Nigeria's Forestry Potential for Structural Applications: Structural Reliability of Nigerian Grown Opepe Timber

Authors: J. I. Aguwa, S. Sadiku, M. Abdullahi

Abstract:

This study examined the structural reliability of Nigerian-grown Opepe timber as a bridge beam material. The strength of a particular species of timber depends strongly on factors such as the soil and environment in which it is grown. The steps involved are collection of the Opepe timber samples, seasoning/preparation of the test specimens, determination of the strength properties with statistical analysis, development of a computer program in FORTRAN, and finally structural reliability analysis using the FORM 5 software. The results revealed that Nigerian-grown Opepe is a reliable and durable structural bridge beam material for a span of 5000 mm, a depth of 400 mm, a breadth of 250 mm and an end bearing length of 150 mm. The probabilities of failure in bending parallel to the grain, compression perpendicular to the grain, shear parallel to the grain and deflection are 1.61 x 10^-7, 1.43 x 10^-8, 1.93 x 10^-4 and 1.51 x 10^-15, respectively. The paper recommends the establishment of Opepe plantations in various Local Government Areas in Nigeria for structural applications such as bridges and railway sleepers, as well as to generate income for the nation and create employment for the numerous unemployed youths.
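The failure probabilities reported above can be read as reliability (safety) indices through the standard first-order relation beta = -Phi^-1(p_f). The following sketch applies that textbook mapping to the abstract's numbers; it is not the paper's FORM 5 computation, only an interpretation aid.

```python
# Sketch: convert the reported failure probabilities p_f into reliability
# (safety) indices via the standard FORM relation beta = -Phi^{-1}(p_f).
# The p_f values come from the abstract; the mapping itself is textbook.
from statistics import NormalDist

_std_normal = NormalDist()

def reliability_index(p_f):
    """Safety index beta corresponding to a failure probability p_f."""
    return -_std_normal.inv_cdf(p_f)

failure_modes = {
    "bending parallel to grain": 1.61e-7,
    "compression perpendicular to grain": 1.43e-8,
    "shear parallel to grain": 1.93e-4,
    "deflection": 1.51e-15,
}
for mode, p_f in failure_modes.items():
    print(f"{mode}: beta = {reliability_index(p_f):.2f}")
```

The shear mode, with the largest failure probability, yields the lowest safety index (roughly beta around 3.5), which is why it dominates the reliability assessment of the beam.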

Keywords: bending and deflection, bridge beam, compression, Nigerian Opepe, shear, structural reliability

Procedia PDF Downloads 455
27558 The Extent of Big Data Analysis by the External Auditors

Authors: Iyad Ismail, Fathilatul Abdul Hamid

Abstract:

This research investigated the extent of big data analysis by external auditors. The paper adopts grounded theory as a framework for conducting a series of semi-structured interviews with eighteen external auditors. The findings describe the extent to which big data is available to, and big data analysis is used by, external auditors in the Gaza Strip, Palestine. The study's outcomes suggest a series of auditing procedures to improve external auditing techniques, leading to a high-quality audit process. The research is also valuable for auditing firms, giving insight into their mechanisms and identifying the most important strategies for achieving competitive audit quality. The results aim to guide academic and professional auditing institutions in developing big data analysis techniques for external auditors. The paper provides appropriate information for the decision-making process and a source of future information on technological auditing.

Keywords: big data analysis, external auditors, audit reliance, internal audit function

Procedia PDF Downloads 62
27557 Laser Writing on Vitroceramic Disks for Petabyte Data Storage

Authors: C. Busuioc, S. I. Jinga, E. Pavel

Abstract:

The continuous need for more non-volatile memories with higher storage capacity, smaller dimensions and weight, and lower costs has led to the exploration of optical lithography on active media, as well as patterned magnetic composites. In this context, optical lithography is a technique that can decrease the information bit size down to the nanometric scale; however, some restrictions arise from the need to break the optical diffraction limit. Major achievements have been obtained by employing a vitroceramic material as the active medium and a laser beam operated at low power for the direct writing procedure. Optical discs with ultra-high density were thus fabricated by a conventional melt-quenching method starting from analytical-purity reagents and subsequently used for 3D recording based on their photosensitive features. Naturally, the next step consists in elucidating the composition and structure of the active centers, in correlation with the use of silver and rare-earth compounds in the synthesis of the optical supports. This has been accomplished by modern characterization methods, namely transmission electron microscopy coupled with selected area electron diffraction, scanning transmission electron microscopy, and electron energy loss spectroscopy. The influence of the laser diode parameters, the silver concentration and the formation of fluorescent compounds on the writing process and the final material properties was investigated. The results indicate a storage capacity two orders of magnitude higher than other reported information storage systems. Moreover, the fluorescent photosensitive vitroceramics may be integrated in other applications that rely on nanofabrication as the driving force in the electronics and photonics fields.

Keywords: data storage, fluorescent compounds, laser writing, vitroceramics

Procedia PDF Downloads 224
27556 Performance Evaluation of an Inventive CO2 Gas Separation Inorganic Ceramic Membrane System

Authors: Ngozi Claribelle Nwogu, Mohammed Nasir Kajama, Oyoh Kechinyere, Edward Gobina

Abstract:

Atmospheric carbon dioxide emissions are considered the greatest environmental challenge the world faces today, and the challenges of controlling them include the recovery of CO2 from flue gas. These efforts have benefited from recent advances in materials process engineering, resulting in the development of inorganic gas separation membranes with the excellent thermal and mechanical stability required for most gas separations. This paper therefore evaluates the performance of a highly selective inorganic membrane for CO2 recovery applications. Analysis of the results obtained is in agreement with experimental literature data; further results show the predicted separation performance of the membranes and the future direction of the research. The materials selection and the membrane preparation techniques are discussed. Methods of reducing interface defects in the membrane, and their effect on the separation performance, are also reviewed, together with advances toward fully exploiting the potential of this innovative membrane.

Keywords: carbon dioxide, gas separation, inorganic ceramic membrane, permselectivity

Procedia PDF Downloads 330
27555 A Model of Teacher Leadership in History Instruction

Authors: Poramatdha Chutimant

Abstract:

The objective of this research was to propose a model of teacher leadership in history instruction for practical utilization. Everett M. Rogers' Diffusion of Innovations theory is applied as the theoretical framework. A qualitative method is used in the study, with an interview protocol as the instrument for collecting primary data from best-practice teachers recognized by the Office of the National Education Commission (ONEC). Open-ended questions are used in the interview protocol in order to gather varied data. Information on the international context of history instruction serves as secondary data supporting the summarizing process (content analysis). A dendrogram is used to interpret and synthesize the primary data, with the secondary data providing explanation and elaboration. In-depth interviews are used to collect information from seven experts in the educational field. The final step is to validate the draft model in terms of future utilization.

Keywords: history study, nationalism, patriotism, responsible citizenship, teacher leadership

Procedia PDF Downloads 276
27554 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation

Authors: Mohammad Anwar, Shah Waliullah

Abstract:

This study investigated panel data regression models, using Bayesian and classical methods to study the impact of institutions on economic growth with data from 1990-2014, especially in developing countries. Under both the classical and the Bayesian methodology, two panel data models were estimated: common effects and fixed effects. For the Bayesian approach, prior information is used, with a normal-gamma prior for the panel data models; the analysis was done with the WinBUGS14 software. The estimated results showed that panel data models are valid models in the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed-effects model is the best model in the Bayesian estimation of panel data models, as it was shown to have the lowest standard error compared to the other models.
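For readers unfamiliar with the fixed-effects specification that the study estimates, here is a minimal classical sketch, not the paper's WinBUGS/normal-gamma setup: the "within" estimator removes each entity's fixed intercept by demeaning and then fits the common slope. The panel below is fabricated for illustration.

```python
# Sketch: classical fixed-effects ("within") estimator for a one-regressor
# panel y_it = a_i + b * x_it + e_it. Demeaning each entity's observations
# removes the fixed effect a_i; OLS on the demeaned data recovers b.
# (The paper estimates this model in a Bayesian way with normal-gamma
# priors in WinBUGS14; the data here are invented.)
from statistics import mean

def within_estimator(panel):
    """panel: dict mapping entity -> list of (x, y) observations."""
    xs, ys = [], []
    for obs in panel.values():
        mx = mean(x for x, _ in obs)
        my = mean(y for _, y in obs)
        for x, y in obs:               # subtract entity means
            xs.append(x - mx)
            ys.append(y - my)
    # OLS slope on the pooled demeaned data (no intercept needed)
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

panel = {
    "country_A": [(1.0, 5.2), (2.0, 7.1), (3.0, 9.0)],   # high intercept
    "country_B": [(1.0, 1.1), (2.0, 3.0), (3.0, 5.1)],   # low intercept
}
b_hat = within_estimator(panel)   # close to the common slope of about 2
```

The Bayesian fixed-effects model adds priors on the intercepts, the slope and the error precision, and reports posterior means and standard errors instead; the demeaning logic above is what distinguishes it from the common-effects model.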

Keywords: Bayesian approach, common effect, fixed effect, random effect, Dynamic Random Effect Model

Procedia PDF Downloads 65
27553 A Geosynchronous Orbit Synthetic Aperture Radar Simulator for Moving Ship Targets

Authors: Linjie Zhang, Baifen Ren, Xi Zhang, Genwang Liu

Abstract:

Ship detection is of great significance for both military and civilian applications, and synthetic aperture radar (SAR), with its all-day, all-weather, ultra-long-range characteristics, has been used widely. In view of the low time resolution of low-Earth-orbit SAR and the need for high-time-resolution SAR data, geosynchronous orbit (GEO) SAR is receiving more and more attention. Since GEO SAR has a short revisit period and a large coverage area, it is expected to be well suited to monitoring marine ship targets. However, the height of the orbit increases the integration time by almost two orders of magnitude, so for moving marine vessels the utility and efficacy of GEO SAR are still uncertain. This paper examines the feasibility of GEO SAR by presenting a GEO SAR simulator for moving ships. The simulator is a geometry-based radar imaging simulator that focuses on geometric quality rather than radiometric fidelity. Its inputs are a 3D ship model (.obj format, produced by most 3D design software, such as 3ds Max), the ship's velocity, and the parameters of the satellite orbit and SAR platform. Its outputs are simulated GEO SAR raw signal data and a SAR image. The simulation proceeds in four steps. (1) Read the 3D model, including the ship's rotation (pitch, yaw, and roll) and velocity (speed and direction) parameters, and extract the primitives (triangles) visible from the SAR platform. (2) Compute the radar scattering from the ship with the physical optics (PO) method; in this step, the vessel is sliced into many small rectangular primitives along the azimuth, and the radiometric calculation of each primitive is carried out separately. Since the simulator focuses only on the complex structure of ships, only single-bounce and double-bounce reflections are considered. (3) Generate the raw data with GEO SAR signal modeling; since the usual 'stop-and-go' model does not hold for GEO SAR, the range model is reconsidered. (4) Finally, generate the GEO SAR image with an improved range-Doppler method. Numerical simulations of a fishing boat and a cargo ship are given: GEO SAR images for different ship postures, velocities, satellite orbits, and SAR platforms are simulated. By analyzing these simulated results, the effectiveness of GEO SAR for the detection of moving marine vessels is evaluated.
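The heart of step (3) is the slant-range history between the platform and the moving ship over the long GEO integration time. The following sketch computes such a range history and the corresponding ideal phase history exp(-j*4*pi*R/lambda) for a point target; the orbit, wavelength, and ship motion are illustrative values, not the paper's actual simulator parameters.

```python
# Minimal geometric sketch of the raw-signal step: sample the slant range
# between a (here, simplified as stationary) GEO platform and a moving ship,
# and form the ideal point-target phase history exp(-j*4*pi*R/lambda).
# Wavelength, altitude, and ship speed below are assumed, not the paper's.
import cmath
import math

WAVELENGTH = 0.24           # metres (assumed L-band)
GEO_ALTITUDE = 35_786e3     # metres, geosynchronous altitude

def sat_pos(t):
    # Simplification: platform treated as fixed directly overhead.
    return (0.0, 0.0, GEO_ALTITUDE)

def ship_pos(t, speed=10.0, heading=(1.0, 0.0)):
    # Ship moving on a flat sea surface at constant speed (m/s).
    return (speed * t * heading[0], speed * t * heading[1], 0.0)

def phase_history(t0, t1, n):
    """Return (time, slant range, unit-magnitude phasor) samples."""
    samples = []
    for k in range(n):
        t = t0 + (t1 - t0) * k / (n - 1)
        r = math.dist(sat_pos(t), ship_pos(t))
        samples.append((t, r, cmath.exp(-4j * math.pi * r / WAVELENGTH)))
    return samples

hist = phase_history(0.0, 100.0, 512)   # 100 s slice of the long integration
```

A real GEO SAR signal model must additionally account for the satellite's figure-eight orbital motion, Earth rotation, and the breakdown of the 'stop-and-go' approximation noted above; the sketch only shows how target motion enters the range (and hence phase) history.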

Keywords: GEO SAR, radar, simulation, ship

Procedia PDF Downloads 169
27552 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, heart rhythm disorders were identified from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial-immune-system-based artificial neural network (AIS-ANN) and particle swarm optimization based artificial neural network (PSO-ANN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers against the ANN and AIS. For this purpose, the RR intervals were found for the normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK) and atrial fibrillation (AF) data. These data were then combined into pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK and NSR-AF); a discrete wavelet transform was applied to each pair, and after data reduction two different data sets, with 9 and 27 features, were obtained from each. Afterwards, the data were first shuffled, and then 4-fold cross-validation was applied to create the training and testing data. The training and testing accuracy rates and training times were compared. As a result, the performance of the hybrid classification systems AIS-ANN and PSO-ANN was seen to be close to that of the ANN system, and the results of the hybrid systems were much better than those of AIS. However, ANN had a much shorter training time than the other systems; in terms of training time, ANN was followed by PSO-ANN, AIS-ANN and AIS, respectively. The features extracted from the data also affected the classification results significantly.
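The feature-extraction step described above can be sketched as follows. This is a hedged illustration, not the study's exact pipeline: a one-level Haar discrete wavelet transform is applied to an RR-interval sequence and simple statistics of the coefficients form a reduced feature vector (the study's 9- and 27-feature sets and wavelet choice are not specified here, and the RR values below are fabricated).

```python
# Sketch of the DWT-based feature step: one-level Haar wavelet transform of
# an RR-interval sequence, then coefficient statistics as a reduced feature
# vector fed to a classifier. Wavelet choice and feature set are assumptions;
# the RR values are invented, not MIT-BIH data.
from statistics import mean, pstdev

def haar_dwt(signal):
    """One-level Haar DWT: (approximation, detail) coefficient lists."""
    s = 2 ** -0.5
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def features(rr_intervals):
    """Reduced feature vector: trend statistics plus beat-to-beat variability."""
    a, d = haar_dwt(rr_intervals)
    return [mean(a), pstdev(a), max(map(abs, d)), mean(map(abs, d))]

rr = [0.81, 0.80, 0.79, 0.83, 0.62, 0.95, 0.80, 0.81]  # fabricated RR series (s)
vec = features(rr)
```

In the study, vectors like `vec` (computed per rhythm pair, e.g. NSR vs. APC) are what the ANN, AIS, and hybrid classifiers receive after data reduction; the detail-coefficient statistics capture the beat-to-beat irregularity that distinguishes arrhythmias from normal sinus rhythm.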

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO

Procedia PDF Downloads 438
27551 Topic Modelling Using Latent Dirichlet Allocation and Latent Semantic Indexing on SA Telco Twitter Data

Authors: Phumelele Kubheka, Pius Owolawi, Gbolahan Aiyetoro

Abstract:

Twitter is one of the most popular social media platforms, where users can share their opinions on different subjects. As of 2010, the Twitter platform generated more than 12 terabytes of data daily, roughly 4.3 petabytes in a single year. For this reason, Twitter is a great source for big data mining. Industries such as telecommunication companies can leverage the availability of Twitter data to better understand their markets and make appropriate business decisions. This study performs topic modeling on Twitter data using Latent Dirichlet Allocation (LDA), and the results are benchmarked against another topic modeling technique, Latent Semantic Indexing (LSI). The study aims to retrieve topics from a Twitter dataset containing user tweets on South African Telcos. The results show that LSI is much faster than LDA; however, LDA yields better results, with topic coherence higher by 8% for the best-performing model represented in Table 1. A higher topic coherence score indicates better performance of the model.
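As a rough illustration of what LDA computes (the study itself would use standard tooling rather than this sketch), a collapsed Gibbs sampler for LDA fits in a few dozen lines: each word token is repeatedly reassigned to a topic in proportion to how common that topic is in its document and how common the word is in that topic. The toy corpus and hyperparameters below are invented.

```python
# Toy collapsed Gibbs sampler for LDA, for illustration only -- the study's
# actual models would be fit with standard topic-modeling libraries.
# Corpus, hyperparameters (alpha, beta) and iteration count are invented.
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics, iters=100, alpha=0.1, beta=0.01, seed=0):
    rng = random.Random(seed)
    V = len({w for doc in docs for w in doc})           # vocabulary size
    nzw = [defaultdict(int) for _ in range(n_topics)]   # topic -> word counts
    ndz = [[0] * n_topics for _ in docs]                # doc -> topic counts
    nz = [0] * n_topics                                 # tokens per topic
    assignments = []
    for d, doc in enumerate(docs):                      # random initialization
        zs = []
        for w in doc:
            z = rng.randrange(n_topics)
            zs.append(z)
            nzw[z][w] += 1; ndz[d][z] += 1; nz[z] += 1
        assignments.append(zs)
    for _ in range(iters):                              # Gibbs sweeps
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                z = assignments[d][i]                   # remove current token
                nzw[z][w] -= 1; ndz[d][z] -= 1; nz[z] -= 1
                weights = [                             # full conditional
                    (ndz[d][k] + alpha) * (nzw[k][w] + beta) / (nz[k] + V * beta)
                    for k in range(n_topics)
                ]
                z = rng.choices(range(n_topics), weights=weights)[0]
                assignments[d][i] = z                   # reassign token
                nzw[z][w] += 1; ndz[d][z] += 1; nz[z] += 1
    # Top three words per topic
    return [sorted(nzw[k], key=nzw[k].get, reverse=True)[:3] for k in range(n_topics)]

docs = [
    ["signal", "network", "data"],
    ["price", "plan", "contract"],
    ["network", "signal", "network"],
    ["plan", "price", "plan"],
]
topics = lda_gibbs(docs, n_topics=2, iters=60)
```

LSI, by contrast, is a single truncated SVD of the term-document matrix, which is why it runs much faster than this iterative sampling but, as the study finds, tends to produce less coherent topics.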

Keywords: big data, latent Dirichlet allocation, latent semantic indexing, telco, topic modeling, twitter

Procedia PDF Downloads 145