Search results for: methodical framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5131

3571 Understanding the Reasons for Flooding in Chennai and Strategies for Making It Flood Resilient

Authors: Nivedhitha Venkatakrishnan

Abstract:

Flooding in urban areas in India has become a recurring phenomenon and a nightmare for most cities, a consequence of man-made disruption that results in disaster. City planning in India falls short of withstanding hydro-generated disasters. This has become a barrier and a challenge to the development brought about by urbanization: high population density, expanding informal settlements, and environmental degradation from uncollected and untreated waste that flows into natural drains and water bodies have disrupted natural hazard-protection mechanisms such as drainage channels, wetlands and floodplains. The magnitude and impact of the mishap were high because of the failure of the development policies, strategies and plans that the city had adopted. In the current scenario, cities are becoming the home of the future, with economic diversification bringing more investment into cities, especially in the domains of urban infrastructure, planning and design. The uncertain urban futures of these low-elevation coastal zones face unprecedented risk and threat. The study focuses on three major pillars of resilience: Recover, Resist and Restore. Getting ready to handle such situations bridges the gap between disaster response management and risk reduction, and requires a paradigm shift. The study involved qualitative research and a system design approach (framework). The initial stage involved mapping the urban water morphology with respect to spatial growth, which gave an insight into the water bodies that have gone missing over the years during the process of urbanization. The major finding of the study was that missing links in the traditional water harvesting network were a major reason for the man-made disaster. The research conceptualized a sponge city framework that would guide growth through institutional frameworks at different levels. The next stage was understanding the implementation process at various stages to ensure the paradigm shift, demonstrating the concepts at a neighborhood level, where, how and what the functions and benefits of each component are, and quantifying the design decisions in terms of rainwater harvested, surface runoff, and how much water is collected and how it could be collected, stored and reused. The study concludes with further recommendations for Water Mitigation Spaces that will revive the traditional harvesting network.

Keywords: flooding, man-made disaster, resilient city, traditional harvesting network, water bodies

Procedia PDF Downloads 140
3570 A Theoretical Framework of Multifactor Systematic Risks in Equity Market: Behavioral Finance Paradigm

Authors: Jasman Tuyon, Zamri Ahmad

Abstract:

Behavioral asset pricing research has been gaining momentum since the 1990s. However, it is still incomplete and has been criticized for philosophical, theoretical and model specification limitations. Due to these drawbacks, the treatment of investors' behavior as a source of risk in behavioral asset pricing modeling remains disputable. This paper aims to address these issues from an alternative perspective based on the behavioral finance paradigm. Specifically, this paper proposes a theoretical linkage of both fundamental and behavioral risks to stock price formation, and an extension of the multifactor stock pricing model that combines multifactor fundamental and behavioral risk factors.

Keywords: behavioral finance, multifactor asset pricing, behavioral risks, fundamental risks

Procedia PDF Downloads 499
3569 New Segmentation of Piecewise Moving-Average Model by Using Reversible Jump MCMC Algorithm

Authors: Suparman

Abstract:

This paper addresses the problem of signal segmentation within a Bayesian framework by using the reversible jump MCMC algorithm. The signal is modelled by a piecewise constant Moving-Average (MA) model in which the number of segments, the positions of the change-points, and the order and coefficients of the MA model for each segment are unknown. The reversible jump MCMC algorithm is then used to generate samples distributed according to the joint posterior distribution of the unknown parameters. These samples allow the calculation of some interesting features of the posterior distribution. The performance of the methodology is illustrated via several simulation results.
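As a rough illustration of the trans-dimensional sampling idea described here, the following is a minimal sketch of reversible-jump-style change-point estimation on a toy piecewise-constant signal. It is not the paper's method: the per-segment MA order and coefficients are not sampled, the segment means are profiled out, and the proposal and Jacobian terms of a full RJ-MCMC acceptance ratio are omitted; only the birth/death moves over the number of change-points are shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy signal: piecewise-constant mean plus Gaussian noise (a stand-in for the
# piecewise MA model of the paper, whose per-segment order and coefficients
# would also have to be sampled).
signal = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 80), rng.normal(-1, 1, 120)])
n, sigma, lam = len(signal), 1.0, 3.0   # lam: assumed Poisson prior rate on the number of change-points

def log_like(cps):
    """Gaussian log-likelihood with a separate (profiled) mean per segment."""
    bounds = [0] + sorted(cps) + [n]
    return sum(-0.5 * np.sum((signal[a:b] - signal[a:b].mean()) ** 2) / sigma**2
               for a, b in zip(bounds[:-1], bounds[1:]))

cps = []                                 # current set of change-point positions
for _ in range(5000):
    prop = list(cps)
    if rng.random() < 0.5:               # birth move: propose adding a change-point
        cand = int(rng.integers(1, n - 1))
        if cand not in prop:
            prop.append(cand)
    elif cps:                            # death move: propose removing a change-point
        prop.remove(prop[int(rng.integers(len(prop)))])
    # Simplified acceptance ratio: likelihood ratio times the Poisson prior ratio
    # on the model dimension (proposal and Jacobian terms of full RJ-MCMC omitted).
    k_old, k_new = len(cps), len(prop)
    if k_new > k_old:
        log_prior = np.log(lam) - np.log(k_old + 1)
    elif k_new < k_old:
        log_prior = np.log(k_old) - np.log(lam)
    else:
        log_prior = 0.0
    if np.log(rng.random()) < log_like(prop) - log_like(cps) + log_prior:
        cps = prop

print("estimated change-points:", sorted(cps))
```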

Keywords: piecewise, moving-average model, reversible jump MCMC, signal segmentation

Procedia PDF Downloads 227
3568 CertifHy: Developing a European Framework for the Generation of Guarantees of Origin for Green Hydrogen

Authors: Frederic Barth, Wouter Vanhoudt, Marc Londo, Jaap C. Jansen, Karine Veum, Javier Castro, Klaus Nürnberger, Matthias Altmann

Abstract:

Hydrogen is expected to play a key role in the transition towards a low-carbon economy, especially within the transport sector, the energy sector and the (petro)chemical industry sector. However, the production and use of hydrogen only make sense if the production and transportation are carried out with minimal impact on natural resources, and if greenhouse gas emissions are reduced in comparison to conventional hydrogen or conventional fuels. The CertifHy project, supported by a wide range of key European industry leaders (gas companies, chemical industry, energy utilities, green hydrogen technology developers and automobile manufacturers, as well as other leading industrial players) therefore aims to: 1. Define a widely acceptable definition of green hydrogen. 2. Determine how a robust Guarantee of Origin (GoO) scheme for green hydrogen should be designed and implemented throughout the EU. It is divided into the following work packages (WPs). 1. Generic market outlook for green hydrogen: Evidence of existing industrial markets and the potential development of new energy related markets for green hydrogen in the EU, overview of the segments and their future trends, drivers and market outlook (WP1). 2. Definition of “green” hydrogen: step-by-step consultation approach leading to a consensus on the definition of green hydrogen within the EU (WP2). 3. Review of existing platforms and interactions between existing GoO and green hydrogen: Lessons learnt and mapping of interactions (WP3). 4. Definition of a framework of guarantees of origin for “green” hydrogen: Technical specifications, rules and obligations for the GoO, impact analysis (WP4). 5. Roadmap for the implementation of an EU-wide GoO scheme for green hydrogen: the project implementation plan will be presented to the FCH JU and the European Commission as the key outcome of the project and shared with stakeholders before finalisation (WP5 and 6). Definition of Green Hydrogen: CertifHy Green hydrogen is hydrogen from renewable sources that is also CertifHy Low-GHG-emissions hydrogen. Hydrogen from renewable sources is hydrogen belonging to the share of production equal to the share of renewable energy sources (as defined in the EU RES directive) in energy consumption for hydrogen production, excluding ancillary functions. CertifHy Low-GHG hydrogen is hydrogen with emissions lower than the defined CertifHy Low-GHG-emissions threshold, i.e. 36.4 gCO2eq/MJ, produced in a plant where the average emissions intensity of the non-CertifHy Low-GHG hydrogen production (based on an LCA approach), since sign-up or in the past 12 months, does not exceed the emissions intensity of the benchmark process (SMR of natural gas), i.e. 91.0 gCO2eq/MJ.
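The definition above combines two quantitative criteria: a consignment-level low-GHG threshold of 36.4 gCO2eq/MJ (60% below the benchmark, since 91.0 x 0.4 = 36.4) and a plant-level comparison of the remaining production against the 91.0 gCO2eq/MJ SMR benchmark. The following is a minimal sketch of how those two checks might be applied; the function, its arguments and the final renewable-share split are illustrative assumptions, not part of the CertifHy scheme itself.

```python
# Thresholds taken from the abstract (gCO2eq per MJ of hydrogen).
BENCHMARK_SMR = 91.0        # emissions intensity of the benchmark process (SMR of natural gas)
LOW_GHG_THRESHOLD = 36.4    # CertifHy Low-GHG-emissions threshold

def classify_batch(batch_intensity, plant_avg_non_low_ghg, renewable_share):
    """Illustrative two-step check: (1) the plant average of the non-low-GHG output
    must not exceed the SMR benchmark, (2) the batch itself must beat the low-GHG
    threshold. The renewable share (per the EU RES directive) then determines the
    fraction certifiable as green (renewable + low-GHG) hydrogen."""
    if plant_avg_non_low_ghg > BENCHMARK_SMR:
        return "not certifiable (plant average above SMR benchmark)"
    if batch_intensity > LOW_GHG_THRESHOLD:
        return "not CertifHy Low-GHG hydrogen"
    green_fraction = max(0.0, min(1.0, renewable_share))
    return f"Low-GHG hydrogen; {green_fraction:.0%} certifiable as green (renewable) hydrogen"

print(classify_batch(batch_intensity=25.0, plant_avg_non_low_ghg=80.0, renewable_share=0.4))
```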

Keywords: green hydrogen, cross-cutting, guarantee of origin, certificate, DG energy, bankability

Procedia PDF Downloads 493
3567 Frequent Item Set Mining for Big Data Using MapReduce Framework

Authors: Tamanna Jethava, Rahul Joshi

Abstract:

Frequent item sets play an essential role in many data mining tasks that try to find interesting patterns in databases. Typically, a frequent item set refers to a set of items that frequently appear together in a transaction dataset. Several mining algorithms are used for frequent item set mining, yet most do not scale to the type of data we are presented with today, so-called 'big data'. Big data is a collection of large data sets. Our approach is to perform frequent item set mining over large datasets in a scalable and fast way. MapReduce, along with HDFS, is used to find frequent item sets from big data on a large cluster. This paper focuses on using a pre-processing and mining algorithm as a hybrid approach for big data over the Hadoop platform.
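To make the map and reduce phases concrete, here is a minimal in-memory simulation of counting candidate itemsets per transaction and aggregating their support; the Hadoop/HDFS plumbing and the paper's specific hybrid pre-processing step are not shown, and a real Apriori pass would also prune k-itemset candidates using the frequent (k-1)-itemsets.

```python
from itertools import combinations
from collections import defaultdict

# Toy transaction database (on Hadoop this would be read from HDFS, one transaction per line).
transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]
MIN_SUPPORT = 3  # absolute support threshold

def mapper(transaction, k):
    """Map phase: emit (itemset, 1) for every k-itemset in one transaction."""
    for itemset in combinations(sorted(transaction), k):
        yield itemset, 1

def reducer(pairs):
    """Reduce phase: sum the counts per itemset and keep the frequent ones."""
    counts = defaultdict(int)
    for itemset, one in pairs:
        counts[itemset] += one
    return {s: c for s, c in counts.items() if c >= MIN_SUPPORT}

# One simplified MapReduce pass per itemset size.
for k in (1, 2):
    shuffled = (pair for t in transactions for pair in mapper(t, k))
    print(f"frequent {k}-itemsets:", reducer(shuffled))
```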

Keywords: frequent item set mining, big data, Hadoop, MapReduce

Procedia PDF Downloads 436
3566 A 'Four Method Framework' for Fighting Software Architecture Erosion

Authors: Sundus Ayyaz, Saad Rehman, Usman Qamar

Abstract:

Software architecture is the basic structure of software that shapes the development and evolution of a software system. Software architecture is also considered a significant tool for the construction of high-quality software systems. A clean design supports the control, value and beauty of software, resulting in a longer life, while a bad design is the cause of architectural erosion, in which software evolution completely fails. This paper discusses the occurrence of software architecture erosion and presents a set of methods for the detection, declaration and prevention of architecture erosion. The causes and symptoms of architecture erosion are observed with examples of prescriptive and descriptive architectures, and the practices used to stop this erosion are also discussed by considering different types of software erosion and their effects. Consequently, the most suitable approach for fighting software architecture erosion, and in some way reducing its effect, is identified, evaluated and tested on different scenarios.

Keywords: software architecture, architecture erosion, prescriptive architecture, descriptive architecture

Procedia PDF Downloads 500
3565 By-Line Analysis of the Determinants of Insurance Premiums: Evidence from the Tunisian Market

Authors: Nadia Sghaier

Abstract:

In this paper, we aim to identify the determinants of life and non-life insurance premiums for different lines in the Tunisian insurance market over a recent period from 1997 to 2019. The empirical analysis is conducted using linear cointegration techniques in the panel data framework, which allow for both long- and short-run relationships. The obtained results show evidence of a long-run relationship between premiums, losses, and financial variables (stock market indices and the interest rate). Furthermore, we find that the short-run effect of the explanatory variables differs across lines. This finding has important implications for insurance pricing and regulation.
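The study uses panel cointegration estimators, which are not reproduced here. As a minimal illustration of the long-run relationship idea for a single line, the sketch below runs an Engle-Granger style cointegration test on simulated premium and loss series using statsmodels; the data, drift and coefficient are made up for illustration only.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(1)

# Simulated yearly data for one insurance line, 1997-2019 (23 observations).
T = 23
losses = np.cumsum(rng.normal(0.5, 1.0, T))          # non-stationary losses (random walk with drift)
premiums = 1.2 * losses + rng.normal(0.0, 0.5, T)    # premiums tied to losses in the long run

# Engle-Granger cointegration test: a small p-value suggests a long-run
# equilibrium relationship between premiums and losses for this line.
t_stat, p_value, _ = coint(premiums, losses)
print(f"cointegration t-stat = {t_stat:.2f}, p-value = {p_value:.3f}")
```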

Keywords: insurance premiums, lines, Tunisian insurance market, cointegration approach in panel data

Procedia PDF Downloads 198
3564 Terraria AI: YOLO Interface for Decision-Making Algorithms

Authors: Emmanuel Barrantes Chaves, Ernesto Rivera Alvarado

Abstract:

This paper presents a method that enables agents for the game Terraria to evaluate algorithms commonly used in general video game artificial intelligence competitions. The 'You Only Look Once' (YOLO) model in the first layer of the process obtains information from the screen and translates it into the Video Game Description Language (VGDL), which the agents take as input to make decisions. Two state-of-the-art algorithms were tested and compared: Monte Carlo Tree Search (MCTS) and the Rolling Horizon Evolutionary Algorithm (RHEA); in this case, RHEA shows better performance. The main advantage of this approach is that a VGDL description is not needed beforehand: it is built on the fly, which opens the road to using more games as a framework for AI.
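For readers unfamiliar with RHEA, the following is a minimal sketch of its decision loop: evolve fixed-length action sequences against a forward model and play only the first action of the best sequence. The `step` function, action set, horizon and scoring heuristic are stand-ins; in the paper that role is played by the VGDL model built on the fly from YOLO detections.

```python
import random

ACTIONS = ["left", "right", "jump", "attack", "idle"]
HORIZON, POP, GENS = 8, 10, 20

def step(state, action):
    """Stand-in forward model (replaced in the paper by the on-the-fly VGDL description)."""
    x, hp = state
    x += {"left": -1, "right": 1}.get(action, 0)
    hp -= 1 if action == "idle" else 0
    return (x, hp)

def evaluate(state, plan):
    """Roll the plan forward and score the final state (toy heuristic: go right, stay alive)."""
    for action in plan:
        state = step(state, action)
    x, hp = state
    return x + hp

def rhea_action(state):
    """One RHEA decision: evolve action sequences over the horizon, return the first action."""
    population = [[random.choice(ACTIONS) for _ in range(HORIZON)] for _ in range(POP)]
    for _ in range(GENS):
        population.sort(key=lambda p: evaluate(state, p), reverse=True)
        elite = population[: POP // 2]
        # Refill the population by mutating elite plans.
        population = elite + [
            [a if random.random() > 0.2 else random.choice(ACTIONS) for a in random.choice(elite)]
            for _ in range(POP - len(elite))
        ]
    best = max(population, key=lambda p: evaluate(state, p))
    return best[0]

state = (0, 10)  # (position, health)
print("chosen action:", rhea_action(state))
```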

Keywords: AI, MCTS, RHEA, Terraria, VGDL, YOLOv5

Procedia PDF Downloads 96
3563 Radar-Based Classification of Pedestrian and Dog Using High-Resolution Raw Range-Doppler Signatures

Authors: C. Mayr, J. Periya, A. Kariminezhad

Abstract:

In this paper, we developed a learning framework for the classification of vulnerable road users (VRU) by their range-Doppler signatures. The frequency-modulated continuous-wave (FMCW) radar raw data is first pre-processed to obtain robust object range-Doppler maps per coherent time interval. The complex-valued range-Doppler maps captured from our outdoor measurements are then fed into a convolutional neural network (CNN) to learn the classification. This CNN has gone through a hyperparameter optimization process for improved learning. By learning VRU range-Doppler signatures, the three classes 'pedestrian', 'dog', and 'noise' are classified with an average accuracy of almost 95%. Interestingly, this classification accuracy holds for combined longitudinal and lateral object trajectories.
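A minimal PyTorch sketch of this kind of classifier is shown below: complex-valued range-Doppler maps are encoded as two real channels (real and imaginary parts) and passed through a small CNN with three output classes. The 2-channel encoding, layer sizes and map dimensions are illustrative assumptions, not the paper's tuned architecture.

```python
import torch
import torch.nn as nn

class RangeDopplerCNN(nn.Module):
    """Small CNN over range-Doppler maps; real/imaginary parts as two input channels."""
    def __init__(self, n_classes=3):  # pedestrian, dog, noise
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# A batch of 8 complex range-Doppler maps, 64 range bins x 128 Doppler bins (assumed sizes).
maps = torch.randn(8, 64, 128, dtype=torch.complex64)
x = torch.stack([maps.real, maps.imag], dim=1)   # shape (8, 2, 64, 128)

model = RangeDopplerCNN()
logits = model(x)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 3, (8,)))
loss.backward()
print("logits shape:", tuple(logits.shape))
```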

Keywords: machine learning, radar, signal processing, autonomous driving

Procedia PDF Downloads 245
3562 Overview and Future Opportunities of Sarcasm Detection on Social Media Communications

Authors: Samaneh Nadali, Masrah Azrifah Azmi Murad, Nurfadhlina Mohammad Sharef

Abstract:

Sarcasm is a common phenomenon in social media and a nuanced form of language for stating the opposite of what is meant. Due to this intentional ambiguity, the analysis of sarcasm is a difficult task not only for a machine but even for a human. Although sarcasm detection has an important effect on sentiment analysis, it is usually ignored in social media analysis because it is considered too complicated. While a few systems exist that can detect sarcasm, almost no work has been carried out on reviewing the existing work in this area. This survey presents a nearly complete picture of sarcasm detection techniques and related fields with brief details. The main contributions of this paper are an illustration of recent trends in sarcasm analysis research, a highlighting of the gaps, and a proposal of a new framework that can be explored.

Keywords: sarcasm detection, sentiment analysis, social media, sarcasm analysis

Procedia PDF Downloads 458
3561 Analysis and Design Modeling for Next Generation Network Intrusion Detection and Prevention System

Authors: Nareshkumar Harale, B. B. Meshram

Abstract:

The continued exponential growth of successful cyber intrusions against today's businesses has made it abundantly clear that traditional perimeter security measures are no longer adequate and effective. The network trust architecture has evolved from trust-untrust to Zero Trust, in which essential security capabilities are deployed in a way that provides policy enforcement and protection for all users, devices, applications, data resources, and the communications traffic between them, regardless of their location. Information exchange over the Internet, in spite of the inclusion of advanced security controls, is always prone to innovative and inventive cyberattacks. The TCP/IP protocol stack, the adopted standard for communication over networks, suffers from inherent design vulnerabilities: its communication and session management protocols, routing protocols and security protocols are the cause of major attacks. With the explosion of cyber security threats such as viruses, worms, rootkits, malware and denial-of-service attacks, accomplishing efficient and effective intrusion detection and prevention has become crucial and challenging. In this paper, we propose a design and analysis model for a next generation network intrusion detection and protection system as part of a layered security strategy. The proposed system design provides intrusion detection for a wide range of attacks with a layered architecture and framework. The proposed network intrusion classification framework deals with cyberattacks on the standard TCP/IP protocol, routing protocols and security protocols. It thereby forms the basis for the detection of attack classes and applies signature-based matching for known cyberattacks and data-mining-based machine learning approaches for unknown cyberattacks. Our implemented software can effectively detect attacks even when malicious connections are hidden within normal events. The unsupervised learning algorithm applied to network audit data trails results in unknown intrusion detection. Association rule mining algorithms generate new rules from the collected audit trail data, resulting in increased intrusion prevention through integrated firewall systems. Intrusion response mechanisms can be initiated in real time, thereby minimizing the impact of network intrusions. Finally, we show that our approach can be validated and how the analysis results can be used for detecting and protecting against new network anomalies.
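To illustrate the association-rule-mining step described above, the following is a tiny self-contained rule miner over audit-trail-like connection records; the attribute names, thresholds and attack label are hypothetical, and the rule-to-firewall step is only indicated in a comment, not part of the paper's implementation.

```python
from itertools import combinations

# Hypothetical audit-trail events: one set of attributes per connection record.
audit_trail = [
    frozenset({"proto=tcp", "flag=SYN", "dst_port=80", "label=normal"}),
    frozenset({"proto=tcp", "flag=SYN", "dst_port=80", "label=normal"}),
    frozenset({"proto=tcp", "flag=SYN", "num_conn=high", "label=dos"}),
    frozenset({"proto=tcp", "flag=SYN", "num_conn=high", "label=dos"}),
    frozenset({"proto=udp", "dst_port=53", "label=normal"}),
]
N, MIN_SUPPORT, MIN_CONF = len(audit_trail), 0.4, 0.9

def support(itemset):
    return sum(itemset <= record for record in audit_trail) / N

# Frequent 1- and 2-item antecedent candidates (a tiny Apriori-style pass).
items = {i for record in audit_trail for i in record if not i.startswith("label=")}
candidates = [frozenset(c) for k in (1, 2) for c in combinations(sorted(items), k)]
frequent = [c for c in candidates if support(c) >= MIN_SUPPORT]

# Generate rules of the form antecedent -> attack label; high-confidence rules
# could be pushed to the integrated firewall as new blocking signatures.
for antecedent in frequent:
    joint = support(antecedent | {"label=dos"})
    conf = joint / support(antecedent)
    if joint >= MIN_SUPPORT and conf >= MIN_CONF:
        print(f"{set(antecedent)} -> label=dos  (support={joint:.2f}, confidence={conf:.2f})")
```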

Keywords: network intrusion detection, network intrusion prevention, association rule mining, system analysis and design

Procedia PDF Downloads 227
3560 Effects of an Envious Experience on Schadenfreude and Economic Decisions Making

Authors: Pablo Reyes, Vanessa Riveros Fiallo, Cesar Acevedo, Camila Castellanos, Catalina Moncaleano, Maria F. Parra, Laura Colmenares

Abstract:

Social emotions are physiological, cognitive and behavioral phenomena that intervene in the adaptation mechanisms of individuals and their context. They are mediated by interpersonal relationships and language. Such emotions are subdivided into moral and comparison emotions. The present research emphasizes two comparative emotions: envy and Schadenfreude. Envy arises when a person lacks a quality, possession or achievement that is superior in someone else. Schadenfreude (SC) expresses the pleasure someone experiences at the misfortune of another. The relationship between both emotions has been questioned before: there are reports showing that envy increases and modulates the SC response, while other reports suggest that envy causes the SC response. However, the methodological approach to the topic has relied on self-reports as well as hypothetical scenarios. Given this problem, the social neuroscience framework provides an alternative and demonstrates that social emotions have neurophysiological correlates that can be measured. This is relevant when studying socially reprehensible emotions such as envy or SC: when tested, individuals tend to report low ratings due to social desirability. This study presents a research protocol proposal and progress on its piloting. The aim is to evaluate the effect that feeling envy and Schadenfreude has on the decision-making process, as well as on cooperative behavior, in an economic game. To this end, an experimental model was proposed that provokes envy through games played against an unknown opponent. The game consists of answering general knowledge questions. The difficulty level of the questions and the opponent's facial response are manipulated in order to generate an ecological comparison framework and elicit both envy and SC. During the game, an electromyography recording will be made of two facial muscles that have been associated with the expression of envy and SC. One of the innovations of the current proposal is the measurement of the effect that these emotions have on a specific behavior. To that extent, the effect of each condition on the dictator economic game was evaluated. The main intention is to evaluate whether a social emotion can modulate actions that have been associated with social norms in the literature. The results of the evaluation of a pilot model (without the electromyography recording, using self-report) show an association between envy and SC: the more envy individuals report, the greater the chance of experiencing SC. The results of the economic game show a slight tendency towards profit-maximizing decisions. It is expected that, when real cash is used, this behavior will be strengthened and will also correlate with the electromyography responses.

Keywords: envy, schadenfreude, electromyography, economic games

Procedia PDF Downloads 371
3559 Efficient Management of Construction Logistics: A Challenge to Both Conventional and Technological Systems in the Developing Nations

Authors: Nuruddeen Usman, Ahmad Muhammad Ibrahim

Abstract:

The management of construction logistics at construction sites becomes increasingly complex with rising construction volume, which has made it relatively inefficient in developing nations even with technological advancement. The objective of this research is to conceptually synthesise the approaches and challenges encountered in the course of construction logistics management, with the aim of proffering possible solutions. Therefore, this study appraised the glitches associated with both conventional and technological methods of construction logistics management that result in its inefficiency. The investigation found that both conventional and technological issues were due to certain obstacles that affect construction logistics management and result in delays, accidents, fraudulent activities, and time and cost overruns. Therefore, this study has developed a framework that might bring a lasting solution to the challenges of construction logistics management.

Keywords: construction, conventional, logistic, technological

Procedia PDF Downloads 554
3558 A Study on Big Data Analytics, Applications and Challenges

Authors: Chhavi Rana

Abstract:

The aim of the paper is to highlight existing developments in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data, which is hard to organise and analyse and can be dealt with using the frameworks and models in this field of study. An organization's decision-making strategy can be enhanced using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently make things better for society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks in the process of analysis using different machine-learning techniques. Finally, the paper concludes by stating different challenges and issues raised in existing research.

Keywords: big data, big data analytics, machine learning, review

Procedia PDF Downloads 83
3557 A Study on Big Data Analytics, Applications, and Challenges

Authors: Chhavi Rana

Abstract:

The aim of the paper is to highlight existing developments in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data which is hard to organise and analyse and can be dealt with using the frameworks and models in this field of study. An organisation's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently make better things for society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks in the process of analysis using different machine learning techniques. Finally, the paper concludes by stating different challenges and issues raised in existing research.

Keywords: big data, big data analytics, machine learning, review

Procedia PDF Downloads 95
3556 Multiscale Edge Detection Based on Nonsubsampled Contourlet Transform

Authors: Enqing Chen, Jianbo Wang

Abstract:

It is well known that the wavelet transform provides a very effective framework for multiscale edge analysis. However, wavelets are not very effective in representing images containing distributed discontinuities such as edges. In this paper, we propose a novel multiscale edge detection method in the nonsubsampled contourlet transform (NSCT) domain, which is based on the NSCT's dominant multiscale, multidirectional edge representation and outstanding edge localization. Experiments on real images demonstrate that the proposed method is better than other edge detection methods based on the Canny operator, wavelets and contourlets. Additionally, the proposed method also works well for noisy images.
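Since NSCT implementations are not available in common Python libraries, the following is only a conceptual stand-in for the multiscale modulus-maxima idea: gradient magnitudes are computed at several Gaussian smoothing scales and only pixels that respond strongly at every scale are kept, so coarse scales suppress noise while fine scales preserve localization. The scales, quantile threshold and test image are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multiscale_edges(image, scales=(1, 2, 4), keep=0.9):
    """Gradient modulus maxima retained across several smoothing scales
    (a crude stand-in for a multiscale, multidirectional decomposition)."""
    edge_votes = np.zeros_like(image, dtype=float)
    for sigma in scales:
        smoothed = gaussian_filter(image.astype(float), sigma)
        gy, gx = np.gradient(smoothed)
        modulus = np.hypot(gx, gy)
        # Keep only the strongest responses at this scale.
        threshold = np.quantile(modulus, keep)
        edge_votes += modulus > threshold
    # Pixels that respond as edges at every scale are retained.
    return edge_votes == len(scales)

image = np.zeros((64, 64))
image[:, 32:] = 1.0                      # a vertical step edge
image += np.random.default_rng(0).normal(0, 0.1, image.shape)
print("edge pixels found:", int(multiscale_edges(image).sum()))
```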

Keywords: edge detection, NSCT, shift invariant, modulus maxima

Procedia PDF Downloads 488
3555 Deep Learning for Image Correction in Sparse-View Computed Tomography

Authors: Shubham Gogri, Lucia Florescu

Abstract:

Medical diagnosis and radiotherapy treatment planning using Computed Tomography (CT) rely on the quantitative accuracy and quality of the CT images. At the same time, requirements for CT imaging include reducing the radiation dose exposure to patients and minimizing scanning time. A solution to this is the sparse-view CT technique, based on a reduced number of projection views. This, however, introduces a new problem— the incomplete projection data results in lower quality of the reconstructed images. To tackle this issue, deep learning methods have been applied to enhance the quality of the sparse-view CT images. A first approach involved employing Mir-Net, a dedicated deep neural network designed for image enhancement. This showed promise, utilizing an intricate architecture comprising encoder and decoder networks, along with the incorporation of the Charbonnier Loss. However, this approach was computationally demanding. Subsequently, a specialized Generative Adversarial Network (GAN) architecture, rooted in the Pix2Pix framework, was implemented. This GAN framework involves a U-Net-based Generator and a Discriminator based on Convolutional Neural Networks. To bolster the GAN's performance, both Charbonnier and Wasserstein loss functions were introduced, collectively focusing on capturing minute details while ensuring training stability. The integration of the perceptual loss, calculated based on feature vectors extracted from the VGG16 network pretrained on the ImageNet dataset, further enhanced the network's ability to synthesize relevant images. A series of comprehensive experiments with clinical CT data were conducted, exploring various GAN loss functions, including Wasserstein, Charbonnier, and perceptual loss. The outcomes demonstrated significant image quality improvements, confirmed through pertinent metrics such as Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM) between the corrected images and the ground truth. Furthermore, learning curves and qualitative comparisons added evidence of the enhanced image quality and the network's increased stability, while preserving pixel value intensity. The experiments underscored the potential of deep learning frameworks in enhancing the visual interpretation of CT scans, achieving outcomes with SSIM values close to one and PSNR values reaching up to 76.
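The loss combination described above can be sketched briefly: a Charbonnier term for pixel fidelity plus a perceptual term computed on VGG16 features pretrained on ImageNet (via torchvision). The epsilon, loss weights and choice of feature layer below are illustrative assumptions, and the Pix2Pix U-Net generator, the discriminator and the Wasserstein term are not reproduced.

```python
import torch
import torch.nn as nn
from torchvision.models import vgg16

class CharbonnierLoss(nn.Module):
    """Smooth L1-like loss: sqrt((x - y)^2 + eps^2), robust to outliers while
    remaining differentiable near zero."""
    def __init__(self, eps=1e-3):
        super().__init__()
        self.eps = eps

    def forward(self, pred, target):
        return torch.mean(torch.sqrt((pred - target) ** 2 + self.eps ** 2))

# Perceptual term: L1 distance between VGG16 feature maps of the corrected and ground-truth slices.
vgg_features = vgg16(weights="IMAGENET1K_V1").features[:16].eval()
for p in vgg_features.parameters():
    p.requires_grad_(False)

def generator_loss(pred, target, w_charb=1.0, w_perc=0.1):
    charb = CharbonnierLoss()(pred, target)
    # Single-channel CT slices are repeated to 3 channels for the VGG feature extractor.
    perc = nn.functional.l1_loss(vgg_features(pred.repeat(1, 3, 1, 1)),
                                 vgg_features(target.repeat(1, 3, 1, 1)))
    return w_charb * charb + w_perc * perc

pred = torch.rand(2, 1, 64, 64, requires_grad=True)   # corrected sparse-view slices
target = torch.rand(2, 1, 64, 64)                     # fully sampled ground truth
loss = generator_loss(pred, target)
loss.backward()
print("combined loss:", float(loss))
```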

Keywords: generative adversarial networks, sparse view computed tomography, CT image correction, Mir-Net

Procedia PDF Downloads 162
3554 Improving Screening and Treatment of Binge Eating Disorders in Pediatric Weight Management Clinic through a Quality Improvement Framework

Authors: Cristina Fernandez, Felix Amparano, John Tumberger, Stephani Stancil, Sarah Hampl, Brooke Sweeney, Amy R. Beck, Helena H Laroche, Jared Tucker, Eileen Chaves, Sara Gould, Matthew Lindquist, Lora Edwards, Renee Arensberg, Meredith Dreyer, Jazmine Cedeno, Alleen Cummins, Jennifer Lisondra, Katie Cox, Kelsey Dean, Rachel Perera, Nicholas A. Clark

Abstract:

Background: Adolescents with obesity are at higher risk of disordered eating than the general population. Detection of eating disorders (ED) is difficult. Screening questionnaires may aid in early detection of ED. Our team’s prior efforts focused on increasing ED screening rates to ≥90% using a validated 10-question adolescent binge eating disorder screening questionnaire (ADO-BED). This aim was achieved. We then aimed to improve treatment plan initiation of patients ≥12 years of age who screen positive for BED within our WMC from 33% to 70% within 12 months. Methods: Our WMC is within a tertiary-care, free-standing children’s hospital. A3, an improvement framework, was used. A multidisciplinary team (physicians, nurses, registered dietitians, psychologists, and exercise physiologists) was created. The outcome measure was documentation of treatment plan initiation of those who screen positive (goal 70%). The process measure was ADO-BED screening rate of WMC patients (goal ≥90%). Plan-Do-Study-Act (PDSA) cycle 1 included provider education on current literature and treatment plan initiation based upon ADO-BED responses. PDSA 2 involved increasing documentation of treatment plan and retrain process to providers. Pre-defined treatment plans were: 1) repeat screen in 3-6 months, 2) resources provided only, or 3) comprehensive multidisciplinary weight management team evaluation. Run charts monitored impact over time. Results: Within 9 months, 166 patients were seen in WMC. Process measure showed sustained performance above goal (mean 98%). Outcome measure showed special cause improvement from mean of 33% to 100% (n=31). Of treatment plans provided, 45% received Plan 1, 4% Plan 2, and 46% Plan 3. Conclusion: Through a multidisciplinary improvement team approach, we maintained sustained ADO-BED screening performance, and, prior to our 12-month timeline, achieved our project aim. Our efforts may serve as a model for other multidisciplinary WMCs. Next steps may include expanding project scope to other WM programs.

Keywords: obesity, pediatrics, clinic, eating disorder

Procedia PDF Downloads 63
3553 A Framework for Incorporating Non-Linear Degradation of Conductive Adhesive in Environmental Testing

Authors: Kedar Hardikar, Joe Varghese

Abstract:

Conductive adhesives have found wide-ranging applications in the electronics industry, from fixing a defective conductor on a printed circuit board (PCB) and attaching an electronic component in an assembly to protecting electronic components through the formation of a 'Faraday cage.' The reliability requirements for a conductive adhesive vary widely depending on the application and expected product lifetime. While the conductive adhesive is required to maintain structural integrity, the electrical performance of the associated sub-assembly can be affected by degradation of the conductive adhesive. The degradation of the adhesive depends on the highly varied use case. The conventional approach to assessing the reliability of the sub-assembly involves subjecting it to standard environmental test conditions such as high temperature and high humidity, thermal cycling, and high-temperature exposure, to name a few. In order to project test data and observed failures to field performance, systematic development of an acceleration factor between test conditions and field conditions is crucial. Common acceleration factor models such as the Arrhenius model are based on rate kinetics and typically rely on an assumption of degradation that is linear in time for a given condition and test duration. The application of interest in this work involves a conductive adhesive used in the electronic circuit of a capacitive sensor. The degradation of the conductive adhesive in a high-temperature, high-humidity environment is quantified by capacitance values. Under such conditions, the use of established models such as the Hallberg-Peck model or the Eyring model to predict time to failure in the field typically relies on a linear degradation rate. In this particular case, the degradation is nonlinear in time and exhibits a square-root-of-time dependence. It is also shown that, for the mechanism of interest, the presence of moisture is essential, and the dominant mechanism driving the degradation is the diffusion of moisture. In this work, a framework is developed to incorporate nonlinear degradation of the conductive adhesive in the development of an acceleration factor. This method can be extended to applications where nonlinearity in the degradation rate can be adequately characterized in tests. It is shown that, depending on the expected product lifetime, the conventional linear degradation approach can overestimate or underestimate field performance. This work provides guidelines for the suitability of the linear degradation approximation for such varied applications.
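A small numerical sketch of the point being made, with purely illustrative numbers rather than the measured adhesive data: if degradation follows rate x t^p with p = 1 (linear) or p = 0.5 (square-root of time, diffusion-limited), the same rate-based acceleration factor translates into very different accelerations in time to failure.

```python
# Illustrative numbers only (not the tested adhesive).
D_test = 2.0      # degradation rate observed under accelerated test
limit = 30.0      # capacitance shift (arbitrary units) treated as the failure criterion
AF = 10.0         # assumed acceleration factor on the *rate*, e.g. from a Hallberg-Peck style model

def time_to_failure(rate, exponent):
    """Solve rate * t**exponent = limit for t."""
    return (limit / rate) ** (1.0 / exponent)

for exponent, label in [(1.0, "linear degradation"), (0.5, "sqrt(t) degradation")]:
    ttf_test = time_to_failure(D_test, exponent)
    ttf_field = time_to_failure(D_test / AF, exponent)
    print(f"{label:22s}: test TTF = {ttf_test:7.1f} h, "
          f"field TTF = {ttf_field:9.1f} h, effective time AF = {ttf_field / ttf_test:.1f}")
```

Under the square-root law, a factor-of-AF change in rate corresponds to a factor-of-AF squared change in time to failure, which is why projecting such data with a linear assumption can misestimate field life in either direction depending on the product lifetime of interest.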

Keywords: conductive adhesives, nonlinear degradation, physics of failure, acceleration factor model

Procedia PDF Downloads 135
3552 The Right to Data Portability and Its Influence on the Development of Digital Services

Authors: Roman Bieda

Abstract:

The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used and machine-readable format, and to transmit this data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g. applications), will also facilitate changing the provider of services (e.g. changing a bank or a cloud computing service provider). Therefore, it will contribute to the development of competition and the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.

Keywords: data portability, digital market, GDPR, personal data

Procedia PDF Downloads 473
3551 The Role of the Board of Directors and Chief Executive Officers in Leading and Embedding Corporate Social Responsibility within Corporate Governance Regulations

Authors: Khalid Alshaikh

Abstract:

In recent years, leadership, Corporate Governance (CG) and Corporate Social Responsibility (CSR) have come under scrutiny in Libyan society. Scholars and institutions have begun investigating possible resolutions they can arrange to alleviate the economic, social and environmental problems the war has produced. Thus far, these constructs require an in-depth reinvestigation, reconceptualization and analysis to clearly reconstruct their rules and regulations. With the demise of Qaddafi's regime, the levels, degrees and efforts to apply CG regulations have varied between public and private commercial banks. CSR is a new organizational culture that is still finding its route within these financial institutions. Detaching itself from any notion of dictatorship and autocratic traits, leadership relies on transformational and transactional styles. Therefore, this paper investigates the extent to which Boards of Directors and Chief Executive Officers (CEOs) redefine these concepts and how they embed CSR within the framework of CG. The research methodology used both public and private banks as a case study, employing qualitative research to interview ten members of Boards of Directors (BoDs) and eleven chief executive managers to explore how leadership, CG and CSR are defined and how leadership integrates CSR into CG structures. The findings suggest that the CG framework in Libya still requires great effort to be developed, and full CG code implementation appears daunting. Also, CSR is still influenced by the power of religion; nevertheless, the Islamic perspective is more consistent with the social contract concept of CSR. Libyan commercial banks do not solely focus on the economic side of maximizing profits, but also concentrate on its morality. The issue is that CSR activities are not enough to achieve good charity publicly, and strategies are needed to address major social issues. Moreover, leadership is more transformational and transactional and endeavors to make economic, social and environmental changes, but these changes are curtailed by tradition and traditional values dominating Libyan social life, where religious and tribal practices establish the relationship between leaders and their subordinates. Finally, the findings reveal that transformational and transactional leadership styles encourage the incorporation of CSR into CG regulations. The boardroom and executive management have a particular role in flagging up how embedded corporate social responsibility is in the organizational culture across the commercial banks, yet the BoDs and CEOs need to do much more to embed corporate social responsibility through their core functions. They need to boost their standing to be more influential and make sure that the right discussions about CSR happen with the right stakeholders involved.

Keywords: board of directors, chief executive officers, corporate governance, corporate social responsibility

Procedia PDF Downloads 171
3550 The Use of Social Networking Sites in eLearning

Authors: Clifford De Raffaele, Luana Bugeja, Serengul Smith

Abstract:

The adaptation of social networking sites (SNS) within higher education has garnered significant interest in recent years, with numerous studies considering it a possible shift from the traditional classroom-based learning paradigm. Notwithstanding this increase in research and conducted studies, however, the adoption of SNS-based modules has failed to proliferate within universities. This paper commences its contribution by analyzing the various models and theories proposed in the literature and amalgamates the effective aspects for the inclusion of social technology within e-learning. A three-phased framework is further proposed, which details the necessary considerations for the successful adaptation of SNS to enhance the student learning experience. This proposal outlines the theoretical foundations, which will be analyzed in practical implementations across international university campuses.

Keywords: eLearning, higher education, social network sites, student learning

Procedia PDF Downloads 340
3549 Protection of Television Programme Formats in Comparative Law

Authors: Mustafa Arikan, Ibrahim Ercan

Abstract:

In this paper, the protection of television programme formats was investigated. The protection of programme formats was studied in French law from the perspective of competition law and the CPI. Since the English judicial system exhibits differences from the legal systems of continental Europe, its investigation bears a special significance. The subject was also handled at length in German law, which was investigated in detail within the overall framework of the study; the court decisions in German law and the views in the doctrine are presented in general terms. There are many court decisions in the American legal system concerning the subject, and these decisions also present alternative solutions to the problem.

Keywords: comparative law, protection of television programme formats, intellectual property, american legal system

Procedia PDF Downloads 331
3548 Social Business Models: When Profits and Impacts Are Not at Odds

Authors: Elisa Pautasso, Matteo Castagno, Michele Osella

Abstract:

In the last decade, the emergence of new social needs as an effect of the economic crisis has stimulated the flourishing of business endeavours characterised by explicit social goals. Social start-ups, social enterprises or Corporate Social Responsibility operations carried out by traditional companies are quintessential examples in this regard. This paper analyses these kinds of initiatives in order to discover the main characteristics of social business models and to provide insights to social entrepreneurs for developing or improving their strategies. The research is conducted through the integration of literature review and case study analysis and, thanks to the recognition of the importance of both profits and social impacts as the key success factors for a social business model, proposes a framework for identifying indicators suitable for measuring the social impacts generated.

Keywords: business model, case study, impacts, social business

Procedia PDF Downloads 349
3547 Genomics of Aquatic Adaptation

Authors: Agostinho Antunes

Abstract:

The completion of human genome sequencing in 2003 opened a new perspective on the importance of whole genome sequencing projects, and currently multiple species are having their genomes completely sequenced, from simple organisms, such as bacteria, to more complex taxa, such as mammals. The voluminous sequencing data generated across multiple organisms also provide the framework to better understand the genetic makeup of these and related species, allowing exploration of the genetic changes underlying the evolution of diverse phenotypic traits. Here, recent results from our group, retrieved from comparative evolutionary genomic analyses of selected marine animal species, are considered to exemplify how gene novelty and gene enhancement by positive selection might have been determinant in the success of adaptive radiations into diverse habitats and lifestyles.

Keywords: comparative genomics, adaptive evolution, bioinformatics, phylogenetics, genome mining

Procedia PDF Downloads 533
3546 Measuring Industrial Resiliency by Using a Data Envelopment Analysis Approach

Authors: Ida Bagus Made Putra Jandhana, Teuku Yuri M. Zagloel, Rahmat Nurchayo

Abstract:

Having experienced several crises that affected industrial sector performance in past decades, decision makers should utilize a measurement application that enables them to measure industrial resiliency more precisely. This study provides not only a framework for the development of a resilience measurement application, but also several theories for its conceptual building blocks, such as performance measurement management and resilience engineering in a real-world environment. This research is a continuation of a previously published paper on performance measurement in the industrial sector. Finally, this paper contributes an alternative performance measurement method for the industrial sector based on the resilience concept. Moreover, this research demonstrates how applicable the concept of resilience engineering is, along with its method of measurement.
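As a minimal sketch of how Data Envelopment Analysis efficiency scores can be computed, the following uses one standard DEA formulation (the input-oriented CCR model) solved with scipy's linear programming routine; the abstract does not specify which DEA model the study uses, and the input/output data below are placeholders for whatever resilience indicators the framework collects.

```python
import numpy as np
from scipy.optimize import linprog

# Placeholder data: rows = decision-making units (e.g. industrial sub-sectors),
# columns of X = inputs, columns of Y = outputs (stand-ins for resilience indicators).
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])  # inputs
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])                            # outputs

def ccr_efficiency(o):
    """Input-oriented CCR model for DMU o: minimise theta subject to
    X^T lambda <= theta * x_o, Y^T lambda >= y_o, lambda >= 0."""
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    c = np.concatenate(([1.0], np.zeros(n)))                 # variables: [theta, lambda_1..n]
    A_inputs = np.hstack((-X[o].reshape(m, 1), X.T))         # X^T lam - theta * x_o <= 0
    A_outputs = np.hstack((np.zeros((s, 1)), -Y.T))          # -Y^T lam <= -y_o
    A_ub = np.vstack((A_inputs, A_outputs))
    b_ub = np.concatenate((np.zeros(m), -Y[o]))
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```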

Keywords: industrial, measurement, resilience, sector

Procedia PDF Downloads 278
3545 Wrapping–Decorative Movement of Time

Authors: Rudranil Das

Abstract:

Wrapping is a basic textile technique with great decorative quality. It has long embellished people's lives and cultures in different forms, linking cultures, beliefs, thoughts, technology and, above all, people. Through etymology we can certainly study the movement of the word 'wrapping', but an in-depth analysis could also provide many concepts of structural ability. In India alone, more than 105 different processes exist for wrapping the saree (a type of women's attire), and many more garments found all over the world connect this technique and construction as well. One of the main objectives of this study is to enrich the explanation of wrapping and to develop surfaces using this technique. A deliberately more fragile and stretchable structural framework makes it more appropriate for different users according to their needs. Developments in design and technology could also create a new industry segment and generate employment for marginalized people.

Keywords: concept, existence, philosophical attachment, technological advancement

Procedia PDF Downloads 231
3544 Application of the Balanced Scorecard (BSC) in Education: Case of the International University

Authors: Hieu Nguyen

Abstract:

Performance management is a concern of any organization in the context of increasing demand and fierce competition between education institutions. This paper draws together performance management concepts and focuses specifically on the Balanced Scorecard in the context of education. The study employs semi-structured in-depth interviews to explore the measurement items for each of the sub-objectives in the four perspectives. The role and influence of each perspective's explored measurement items are then discussed, along with how to improve the measurements for better performance management. Finally, the measurements are put together as a suggested Balanced Scorecard framework for the case of the International University.

Keywords: performance management, education institution, balanced scorecard, measurement items, four perspectives, international university

Procedia PDF Downloads 411
3543 Comparing Deep Architectures for Selecting Optimal Machine Translation

Authors: Despoina Mouratidis, Katia Lida Kermanidis

Abstract:

Machine translation (MT) is a very important task in Natural Language Processing (NLP). MT evaluation is crucial in MT development, as it constitutes the means to assess the success of an MT system and also helps improve its performance. Several methods have been proposed for the evaluation of MT systems. Some of the most popular ones in automatic MT evaluation are score-based, such as the BLEU score, while others are based on lexical or syntactic similarity between the MT outputs and the reference, involving higher-level information like part-of-speech (POS) tagging. This paper presents a language-independent machine learning framework for classifying pairwise translations. The framework uses vector representations of two machine-produced translations, one from a statistical machine translation model (SMT) and one from a neural machine translation model (NMT). The vector representations consist of automatically extracted word embeddings and string-like language-independent features. These vector representations are used as input to a multi-layer neural network (NN) that models the similarity between each MT output and the reference, as well as between the two MT outputs. To evaluate the proposed approach, a professional translation and a 'ground-truth' annotation are used. The parallel corpora used are English-Greek (EN-GR) and English-Italian (EN-IT), in the educational domain and of informal genres (video lecture subtitles, course forum text, etc.) that are difficult to translate reliably. Three basic deep learning (DL) architectures were tested with this schema: (i) fully-connected dense, (ii) Convolutional Neural Network (CNN), and (iii) Long Short-Term Memory (LSTM). Experiments show that all tested architectures achieved better results than some well-known basic approaches, such as Random Forest (RF) and Support Vector Machine (SVM). Better accuracy results are obtained when LSTM layers are used in the schema. In terms of a balance between the results, better results are obtained when dense layers are used, because the model then correctly classifies more sentences of the minority class (SMT). For a more integrated analysis of the accuracy results, a qualitative linguistic analysis was carried out. In this context, problems have been identified with some figures of speech, such as metaphors, and with certain linguistic phenomena, such as paronyms. It is quite interesting to find out why all the classifiers led to worse accuracy results in Italian as compared to Greek, taking into account that the linguistic features employed are language independent.
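A minimal PyTorch sketch of the dense variant of such a pairwise classifier is shown below: sentence-level vectors for the SMT output, the NMT output and the reference are concatenated and mapped to a two-class decision. The embedding dimension, layer sizes and random placeholder vectors are illustrative assumptions standing in for the paper's word embeddings and string features.

```python
import torch
import torch.nn as nn

class PairwiseMTClassifier(nn.Module):
    """Dense network that predicts which of two MT outputs (SMT vs NMT) is the better
    translation, given sentence-level vectors of both outputs and the reference."""
    def __init__(self, emb_dim=300, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 * emb_dim, hidden), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(hidden, 2),          # class 0: SMT output is better, class 1: NMT
        )

    def forward(self, smt_vec, nmt_vec, ref_vec):
        return self.net(torch.cat([smt_vec, nmt_vec, ref_vec], dim=-1))

# Stand-ins for averaged word embeddings plus string-like features per sentence.
batch = 16
smt_vec, nmt_vec, ref_vec = (torch.randn(batch, 300) for _ in range(3))
labels = torch.randint(0, 2, (batch,))

model = PairwiseMTClassifier()
loss = nn.CrossEntropyLoss()(model(smt_vec, nmt_vec, ref_vec), labels)
loss.backward()
print("training loss:", float(loss))
```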

Keywords: machine learning, machine translation evaluation, neural network architecture, pairwise classification

Procedia PDF Downloads 132
3542 Automated Java Testing: JUnit versus AspectJ

Authors: Manish Jain, Dinesh Gopalani

Abstract:

The growing dependency of mankind on software technology increases the need for thorough testing of software applications and for automated testing techniques that support testing activities. We outline our testing strategy for performing various types of automated testing of Java applications using AspectJ, which has become the de facto standard for Aspect-Oriented Programming (AOP). Likewise, JUnit, a unit testing framework, is the most popular Java testing tool. In this paper, we evaluate our proposed AOP approach for automated testing against JUnit on various parameters. First, we describe the similarities between the two approaches, and then we provide a detailed comparison of the two testing techniques on factors like lines of testing code, learning curve, and testing of private members. We established that our AOP testing approach using AspectJ has several advantages and is thus more effective than JUnit.

Keywords: aspect oriented programming, AspectJ, aspects, JUnit, software testing

Procedia PDF Downloads 331