Search results for: survival data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25328

24758 An Empirical Study of the Impacts of Big Data on Firm Performance

Authors: Thuan Nguyen

Abstract:

Data is to today's data-driven, knowledge-based economy what oil was to the industrial age: it is everywhere, in vast volumes. Big data analytics is expected to help firms not only improve performance but also completely transform how they run their business. However, employing this emergent technology successfully is not easy, and assessing the role of big data in improving firm performance is even harder. Few studies have examined the impact of big data analytics on organizational performance, and this study aimed to fill that gap. It suggested using firms’ intellectual capital as a proxy for big data in evaluating its impact on organizational performance. The study employed the Value Added Intellectual Coefficient (VAIC) method to measure firm intellectual capital via its three main components: human capital efficiency, structural capital efficiency, and capital employed efficiency, and then used structural equation modeling to model the data and test the models. Financial fundamental and market data of 100 randomly selected publicly listed firms were collected. The tests showed that only human capital efficiency had a significant positive impact on firm profitability, which highlights the prominent human role in the impact of big data technology.
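
For readers unfamiliar with the VAIC method mentioned above, the following is a minimal sketch of the standard Pulic formulation (HCE = VA/HC, SCE = (VA - HC)/VA, CEE = VA/CE); the input figures are hypothetical and not taken from the study.

```python
# Minimal sketch of Pulic's VAIC computation, assuming the standard
# definitions: HCE = VA/HC, SCE = (VA - HC)/VA, CEE = VA/CE.
# The figures below are hypothetical, not from the study's sample.

def vaic(value_added, human_capital, capital_employed):
    hce = value_added / human_capital                   # human capital efficiency
    sce = (value_added - human_capital) / value_added   # structural capital efficiency
    cee = value_added / capital_employed                # capital employed efficiency
    return hce, sce, cee, hce + sce + cee

hce, sce, cee, total = vaic(value_added=5.0e6, human_capital=2.0e6,
                            capital_employed=20.0e6)
print(f"HCE={hce:.2f} SCE={sce:.2f} CEE={cee:.2f} VAIC={total:.2f}")
```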

Keywords: big data, big data analytics, intellectual capital, organizational performance, value added intellectual coefficient

Procedia PDF Downloads 231
24757 Automated Test Data Generation for Some Types of Algorithms

Authors: Hitesh Tahbildar

Abstract:

Test data generation for a program is computationally very expensive. In the general case, no algorithm to generate test data for all types of algorithms has been found, and the cost of generating test data differs across algorithm types. To date, research has emphasized generating test data for different types of programming constructs rather than for different types of algorithms. We implemented test data generation methods to find heuristics for different types of algorithms and tested several of them, including divide and conquer, backtracking, the greedy approach, and dynamic programming, to find the minimum cost of test data generation. Our experimental results indicate that the algorithm type can serve as a necessary condition for selecting heuristics, while programming constructs provide a sufficient condition for selecting our heuristics. Finally, we recommend different heuristics for test data generation to be selected for different types of algorithms.

Keywords: longest path, saturation point, lmax, kL, kS

Procedia PDF Downloads 396
24756 MANIFEST-2, a Global, Phase 3, Randomized, Double-Blind, Active-Control Study of Pelabresib (CPI-0610) and Ruxolitinib vs. Placebo and Ruxolitinib in JAK Inhibitor-Naïve Myelofibrosis Patients

Authors: Claire Harrison, Raajit K. Rampal, Vikas Gupta, Srdan Verstovsek, Moshe Talpaz, Jean-Jacques Kiladjian, Ruben Mesa, Andrew Kuykendall, Alessandro Vannucchi, Francesca Palandri, Sebastian Grosicki, Timothy Devos, Eric Jourdan, Marielle J. Wondergem, Haifa Kathrin Al-Ali, Veronika Buxhofer-Ausch, Alberto Alvarez-Larrán, Sanjay Akhani, Rafael Muñoz-Carerras, Yury Sheykin, Gozde Colak, Morgan Harris, John Mascarenhas

Abstract:

Myelofibrosis (MF) is characterized by bone marrow fibrosis, anemia, splenomegaly and constitutional symptoms. Progressive bone marrow fibrosis results from aberrant megakaryopoiesis and expression of proinflammatory cytokines, both of which are heavily influenced by bromodomain and extraterminal domain (BET)-mediated gene regulation and lead to myeloproliferation and cytopenias. Pelabresib (CPI-0610) is an oral small-molecule investigational inhibitor of BET protein bromodomains currently being developed for the treatment of patients with MF. It is designed to downregulate BET target genes and modify nuclear factor kappa B (NF-κB) signaling. MANIFEST-2 was initiated based on data from Arm 3 of the ongoing Phase 2 MANIFEST study (NCT02158858), which is evaluating the combination of pelabresib and ruxolitinib in Janus kinase inhibitor (JAKi) treatment-naïve patients with MF. Primary endpoint analyses showed splenic and symptom responses in 68% and 56% of 84 enrolled patients, respectively. MANIFEST-2 (NCT04603495) is a global, Phase 3, randomized, double-blind, active-control study of pelabresib and ruxolitinib versus placebo and ruxolitinib in JAKi treatment-naïve patients with primary MF, post-polycythemia vera MF or post-essential thrombocythemia MF. The aim of this study is to evaluate the efficacy and safety of pelabresib in combination with ruxolitinib. Here we report updates from a recent protocol amendment. Key eligibility criteria include a Dynamic International Prognostic Scoring System (DIPSS) score of Intermediate-1 or higher, platelet count ≥100 × 10^9/L, spleen volume ≥450 cc by computed tomography or magnetic resonance imaging, ≥2 symptoms with an average score ≥3 or a Total Symptom Score (TSS) of ≥10 using the Myelofibrosis Symptom Assessment Form v4.0, peripheral blast count <5% and Eastern Cooperative Oncology Group performance status ≤2. Patient randomization will be stratified by DIPSS risk category (Intermediate-1 vs Intermediate-2 vs High), platelet count (>200 × 10^9/L vs 100–200 × 10^9/L) and spleen volume (≥1800 cm^3 vs <1800 cm^3). Double-blind treatment (pelabresib or matching placebo) will be administered once daily for 14 consecutive days, followed by a 7-day break, which is considered one cycle of treatment. Ruxolitinib will be administered twice daily for all 21 days of the cycle. The primary endpoint is SVR35 response (≥35% reduction in spleen volume from baseline) at Week 24, and the key secondary endpoint is TSS50 response (≥50% reduction in TSS from baseline) at Week 24. Other secondary endpoints include safety, pharmacokinetics, changes in bone marrow fibrosis, duration of SVR35 response, duration of TSS50 response, progression-free survival, overall survival, conversion from transfusion dependence to independence and rate of red blood cell transfusion for the first 24 weeks. Study recruitment is ongoing; 400 patients (200 per arm) from North America, Europe, Asia and Australia will be enrolled. The study opened for enrollment in November 2020. MANIFEST-2 was initiated based on data from the ongoing Phase 2 MANIFEST study with the aim of assessing the efficacy and safety of pelabresib and ruxolitinib in JAKi treatment-naïve patients with MF. MANIFEST-2 is currently open for enrollment.
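
As a hedged illustration of the two Week 24 endpoints defined above (SVR35: at least a 35% reduction in spleen volume from baseline; TSS50: at least a 50% reduction in TSS), the sketch below derives responder flags from hypothetical records; the column names and values are invented, not trial data.

```python
import pandas as pd

# Hypothetical illustration of the Week 24 endpoint definitions:
# SVR35 (>=35% spleen-volume reduction) and TSS50 (>=50% TSS reduction).
df = pd.DataFrame({
    "spleen_baseline_cc": [1900.0, 1450.0, 2200.0],
    "spleen_week24_cc":   [1100.0, 1020.0, 2050.0],
    "tss_baseline":       [22.0, 15.0, 30.0],
    "tss_week24":         [9.0, 8.0, 28.0],
})
df["svr35"] = (df.spleen_baseline_cc - df.spleen_week24_cc) / df.spleen_baseline_cc >= 0.35
df["tss50"] = (df.tss_baseline - df.tss_week24) / df.tss_baseline >= 0.50
print(df[["svr35", "tss50"]].mean())   # responder rates in the toy data
```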

Keywords: CPI-0610, JAKi treatment-naïve, MANIFEST-2, myelofibrosis, pelabresib

Procedia PDF Downloads 190
24755 The Perspective on Data Collection Instruments for Younger Learners

Authors: Hatice Kübra Koç

Abstract:

Collecting reliable and valid data is one of the most significant issues for academic researchers. However, the procedure is not the same for all target groups: when collecting data from teenagers, young adults, or adults, researchers can use common data collection tools such as questionnaires, interviews, and semi-structured interviews; for young and very young learners, however, such reliable and valid data collection tools cannot be easily designed or applied. In this study, firstly, common data collection tools are examined for the ‘very young’ and ‘young learner’ participant groups, since the quality and efficiency of an academic study rest mainly on a valid and correct data collection and data analysis procedure. Secondly, two different data collection instruments for very young and young learners are presented, and their efficacy is discussed. Finally, a suggested data collection tool, a performance-based questionnaire, developed specifically for the ‘very young’ and ‘young learner’ participant groups in the field of teaching English to young learners as a foreign language, is presented. The design procedure and the suggested items/factors for the proposed tool are revealed at the end of the study to help researchers who study young and very young learners.

Keywords: data collection instruments, performance-based questionnaire, young learners, very young learners

Procedia PDF Downloads 81
24754 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors

Authors: Yaxin Bi

Abstract:

Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), generative models applied here to time-series data, to generate synthetic time series from Swarm satellite data for use in detecting seismic anomalies. LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the data distribution of the time series. These findings highlight both the promise and the challenges associated with applying deep learning techniques to generate synthetic data, underscoring the potential of deep learning in generating synthetic electromagnetic satellite data.
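
As a hedged illustration of the LSTM half of such a pipeline (the GAN branch and the actual Swarm data handling are omitted), the sketch below trains a next-step predictor on a synthetic stand-in signal and rolls it forward autoregressively to produce a surrogate series; the architecture and hyperparameters are assumptions, not the study's configuration.

```python
import numpy as np
import tensorflow as tf

# Stand-in signal: a noisy sinusoid in place of real Swarm measurements.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 60, 2000)) + 0.1 * rng.standard_normal(2000)

win = 32
X = np.stack([signal[i:i + win] for i in range(len(signal) - win)])[..., None]
y = signal[win:]

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(win, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# Autoregressive rollout: feed predictions back in to synthesize new data.
window = signal[:win].tolist()
synthetic = []
for _ in range(200):
    nxt = model.predict(np.array(window[-win:])[None, :, None], verbose=0).item()
    synthetic.append(nxt)
    window.append(nxt)
```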

Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors

Procedia PDF Downloads 21
24753 Locating the Role of Informal Urbanism in Building Sustainable Cities: Insights from Ghana

Authors: Gideon Abagna Azunre

Abstract:

Informal urbanism is perhaps the most ubiquitous urban phenomenon in sub-Saharan Africa (SSA) and Ghana specifically. Estimates suggest that about two-fifths of urban dwellers (37.9%) in Ghana live in informal settlements, while two-thirds of the working labour force are within the informal economy. This makes Ghana arguably an ‘informal country.’ Informal urbanism involves economic and housing activities that are – in law or in practice – not covered (or insufficiently covered) by formal regulations. Many urban dwellers rely on informal urbanism as a survival strategy due to limited formal waged employment opportunities or rising home prices in the open market. In an era of globalizing neoliberalism, this struggle to survive in cities resonates with several people globally. For years now, there have been intense debates on the utility of informal urbanism – both its economic and housing dimensions – in developing sustainable cities. While some scholars believe that informal urbanism is beneficial to the sustainable city development agenda, others argue that it generates unbearable negative consequences and symbolizes lawlessness and squalor. Consequently, the main aim of this research was to dig below the surface of the narratives to locate the role of informal urbanism in the quest for sustainable cities. The research geographically focused on Ghana and its burgeoning informal sector. Both primary and secondary data were utilized for the analysis: secondary data entailed a synthesis of the fragmented literature on informal urbanism in Ghana, while primary data entailed interviews with informal stakeholders (such as informal settlement dwellers), city authorities, and planners. These two data sets were woven together to discover the nexus between informal urbanism and the tripartite dimensions of sustainable cities – economic, social, and environmental. The results from the research showed a two-pronged relationship between informal urbanism and the three dimensions of sustainable city development. In other words, informal urbanism was identified to both positively and negatively affect the drive for sustainable cities. On the one hand, it provides employment (particularly to women), supplies households’ basic needs (shelter, health, water, and waste management), and enhances civic engagement. On the other hand, it perpetuates social and gender inequalities, insecurity, congestion, and pollution. The research revealed that a ‘black and white’ interpretation and policy approach is incapable of capturing the complexities of informal urbanism. Therefore, trying to eradicate or remove it from the urbanscape because it exhibits some negative consequences means cities will lose its positive contributions; the inverse also holds true. A careful balancing act is necessary to maximize the benefits and minimize the costs. Overall, the research presented a de-colonial theorization of informal urbanism and thus followed post-colonial scholars’ clarion call for African cities to embrace the paradox of informality and find ways to integrate it into the city-building process.

Keywords: informal urbanism, sustainable city development, economic sustainability, social sustainability, environmental sustainability, Ghana

Procedia PDF Downloads 99
24752 Generation of Quasi-Measurement Data for On-Line Process Data Analysis

Authors: Hyun-Woo Cho

Abstract:

To ensure the safety of a manufacturing process, one should quickly identify an assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been frequently utilized. However, such methods suffer from the major problem of small sample size, which is mostly attributed to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Quasi-measurement data are generated from existing data based on two indices of similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method handles the insufficiency problem successfully. In addition, it is shown to be quite efficient in terms of computational speed and memory usage, and thus on-line implementation of the method is straightforward for monitoring and diagnosis purposes.
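
The paper's similarity and importance indices are not spelled out in this abstract, so the following is only a generic sketch of the underlying idea: synthesizing quasi-samples by interpolating between each existing observation and a similar (nearby) one. All names, data, and weights are assumptions, not the paper's method.

```python
import numpy as np

# Generic data-augmentation sketch: each quasi-sample is a convex
# combination of a randomly chosen observation and its most similar
# (nearest) neighbour in the existing reference set.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))            # 30 real measurements, 4 variables

def quasi_samples(X, n_new):
    d = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    np.fill_diagonal(d, np.inf)
    nearest = d.argmin(axis=1)          # most similar existing sample
    idx = rng.integers(0, len(X), n_new)
    lam = rng.uniform(0.2, 0.8, (n_new, 1))
    return lam * X[idx] + (1 - lam) * X[nearest[idx]]

X_aug = np.vstack([X, quasi_samples(X, 60)])   # enlarged reference set
print(X_aug.shape)
```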

Keywords: data analysis, diagnosis, monitoring, process data, quality control

Procedia PDF Downloads 471
24751 Applications of Artificial Intelligence (AI) in Cardiac Imaging

Authors: Angelis P. Barlampas

Abstract:

The purpose of this study is to inform the reader about the various applications of artificial intelligence (AI) in cardiac imaging. AI is growing fast, and its role is crucial in medical specialties that use large amounts of digital data, which are very difficult or even impossible for human beings, especially doctors, to manage. Artificial intelligence refers to the ability of computers to mimic human cognitive function, performing tasks such as learning, problem-solving, and autonomous decision-making based on digital data. Whereas AI describes the concept of using computers to mimic human cognitive tasks, machine learning (ML) describes the category of algorithms that enables most current applications described as AI. Some current applications of AI in cardiac imaging are as follows. Ultrasound: automated segmentation of cardiac chambers across five common views, with quantification of chamber volumes/mass, ejection fraction, and longitudinal strain through speckle tracking; determining the severity of mitral regurgitation (accuracy > 99% for every degree of severity); identifying myocardial infarction; distinguishing between athlete’s heart and hypertrophic cardiomyopathy, as well as restrictive cardiomyopathy and constrictive pericarditis; and predicting all-cause mortality. CT: reducing radiation doses; calculating the calcium score; diagnosing coronary artery disease (CAD); predicting all-cause 5-year mortality; and predicting major cardiovascular events in patients with suspected CAD. MRI: segmenting cardiac structures and infarct tissue; calculating cardiac mass and function parameters; distinguishing between patients with myocardial infarction and control subjects; potentially reducing costs by precluding the need for gadolinium-enhanced CMR; and predicting 4-year survival in patients with pulmonary hypertension. Nuclear imaging: classifying normal and abnormal myocardium in CAD; detecting locations with abnormal myocardium; and predicting cardiac death; ML was comparable to or better than two experienced readers in predicting the need for revascularization. AI emerges as a helpful tool in cardiac imaging and for doctors who cannot manage the ever-increasing demand for examinations such as ultrasound, computed tomography, MRI, or nuclear imaging studies.

Keywords: artificial intelligence, cardiac imaging, ultrasound, MRI, CT, nuclear medicine

Procedia PDF Downloads 68
24750 Solar Cell Packed and Insulator Fused Panels for Efficient Cooling in Cubesat and Satellites

Authors: Anand K. Vinu, Vaishnav Vimal, Sasi Gopalan

Abstract:

All spacecraft components have a range of allowable temperatures that must be maintained to meet survival and operational requirements during all mission phases. Due to heat absorption, transfer, and emission on one side, the satellite surface presents an asymmetric temperature distribution that causes a change in momentum, which manifests in spinning and non-spinning satellites in different manners. This problem can cause orbital decay in a satellite which, if not corrected, will interfere with its primary objective. The thermal analysis of any satellite requires data from the power budget for each of the components used, because each component has different power requirements and is used at specific times in an orbit. Three different cases are run: the worst operational hot case, the worst non-operational cold case, and the operational cold case. Sunlight is a major source of heating on the satellite, and the way in which it affects the spacecraft depends on the distance from the Sun. Any part of a spacecraft or satellite facing the Sun will absorb heat (a net gain), and any facing away will radiate heat (a net loss). We can use a state-of-the-art foldable hybrid insulator/radiator panel: when a panel is opened, that side acts as a radiator for dissipating heat. Here the insulator, in our case aerogel, is sandwiched between solar cells and radiator fins (solar cells outside and radiator fins inside). Each insulated side panel can be opened and closed using actuators, depending on the telemetry data of the CubeSat. The opening and closing of the panels are governed by code designed for this particular application, in which the computer calculates where the Sun is relative to the satellite. According to the data obtained from the sensors, the computer decides which panel to open and by how many degrees. For example, if a panel opens 180 degrees, its solar cells will directly face the Sun, in turn increasing the current generated by that panel. Another example is when a corner of the CubeSat faces the Sun, so that more than one side receives a considerable amount of incident sunlight; the code then determines the optimum opening angle for each panel and adjusts accordingly. Another means of cooling is passive cooling, which is the most suitable system for a CubeSat because of its limited power budget, low mass requirements, and less complex design; it also has advantages in terms of reliability and cost. One passive approach is to make the whole chassis act as a heat sink. For this, we can build the chassis out of heat pipes and connect the heat source to it with a thermal strap that transfers the heat to the chassis.
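
A purely hypothetical sketch of the panel-actuation logic described above follows: open each hinged panel toward the Sun in proportion to how directly its face normal points at the Sun vector. The face names, thresholds, and proportional law are invented for illustration and are not the authors' flight code.

```python
import numpy as np

# Hypothetical panel controller: command angle proportional to the cosine
# between each face normal and the Sun vector (body frame).
PANEL_NORMALS = {                       # unit normals of four hinged faces
    "+X": np.array([1, 0, 0]), "-X": np.array([-1, 0, 0]),
    "+Y": np.array([0, 1, 0]), "-Y": np.array([0, -1, 0]),
}

def panel_commands(sun_vec_body, max_angle_deg=180.0):
    s = sun_vec_body / np.linalg.norm(sun_vec_body)
    cmds = {}
    for name, n in PANEL_NORMALS.items():
        cosang = float(n @ s)
        # Open sun-facing panels (solar cells out); keep shaded panels
        # closed so their inner radiator fins reject heat.
        cmds[name] = max_angle_deg * max(cosang, 0.0)
    return cmds

print(panel_commands(np.array([0.7, 0.7, 0.0])))   # corner-on illumination
```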

Keywords: passive cooling, CubeSat, efficiency, satellite, stationary satellite

Procedia PDF Downloads 88
24749 Emerging Technology for Business Intelligence Applications

Authors: Hsien-Tsen Wang

Abstract:

Business Intelligence (BI) has long helped organizations make informed decisions based on data-driven insights and gain competitive advantages in the marketplace. In the past two decades, businesses witnessed not only the dramatically increasing volume and heterogeneity of business data but also the emergence of new technologies, such as Artificial Intelligence (AI), the Semantic Web (SW), Cloud Computing, and Big Data. It is plausible that the convergence of these technologies would bring more value out of business data by establishing linked data frameworks and connections that enable advanced analytics and improved data utilization. In this paper, we first review and summarize current BI applications and methodology. Emerging technologies that can be integrated into BI applications are then discussed. Finally, we conclude with a proposed synergy framework that aims at achieving a more flexible, scalable, and intelligent BI solution.

Keywords: business intelligence, artificial intelligence, semantic web, big data, cloud computing

Procedia PDF Downloads 88
24748 The Flood Disaster Management of Communities in Ubon Ratchathani Province, Thailand

Authors: Eakarat Boonreang, Anothai Harasarn

Abstract:

The objectives of this study are to investigate the flood disaster management capacity of communities in Ubon Ratchathani province, Thailand, and to recommend sustainable flood management approaches for those communities. The selected population consisted of the community leaders and committees, the executives of local administrative organizations, and the head of the Ubon Ratchathani provincial office of disaster prevention and mitigation. The data was collected by in-depth interviews, focus groups, and observation, and then analyzed and classified in order to determine the communities’ capacity in flood disaster management. The results revealed the communities’ capacities as follows. Before a flood disaster, the community leaders held a meeting with the community committees in order to plan the disaster response and determine evacuation routes, and the villagers moved their belongings to higher places and prepared vehicles for evacuation. During a flood disaster, the communities arranged motorboats for transportation, and villagers evacuated to a temporary evacuation center. Moreover, the communities asked for survival bags, motorboats, emergency toilets, and drinking water from the local administrative organizations and the 22nd Military Circle. After a flood disaster, the villagers cleaned and fixed their houses and also collaborated in cleaning the temple, school, and other places in the community. The recommended approaches for sustainable flood disaster management consisted of structural measures, such as the establishment of reservoirs and the building of higher houses, and non-structural measures, such as raising awareness and fostering self-reliance, establishing disaster management plans, annual rehearsal of disaster response procedures, and transferring disaster knowledge among younger generations. Moreover, local administrative organizations should formulate strategic plans that focus on disaster management capacity building at the community level, particularly regarding non-structural measures. The Ubon Ratchathani provincial office of disaster prevention and mitigation should continually monitor and evaluate the outcomes of community-based disaster risk management programs, including allocating more flood disaster management-related resources among local administrative organizations and communities.

Keywords: capacity building, community based disaster risk management, flood disaster management, Thailand

Procedia PDF Downloads 158
24747 Using Equipment Telemetry Data for Condition-Based Maintenance Decisions

Authors: John Q. Todd

Abstract:

Given that modern equipment can provide comprehensive health, status, and error-condition data via built-in sensors, maintenance organizations have a new and valuable source of insight to take advantage of. This presentation shows what these data payloads might look like and how they can be filtered, visualized, turned into metrics, used for machine learning, and made to generate alerts for further action.
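
A minimal sketch of that filter/metric/alert chain follows; the field names, thresholds, and readings are invented for illustration and do not come from any real telemetry payload.

```python
import pandas as pd

# Toy telemetry payload: hourly bearing temperature plus an error code.
telemetry = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=6, freq="h"),
    "bearing_temp_c": [61, 63, 62, 78, 85, 88],
    "error_code": [0, 0, 0, 0, 17, 17],
})

# Metric: a rolling mean smooths sensor noise before thresholding.
telemetry["temp_roll_mean"] = telemetry.bearing_temp_c.rolling(3, min_periods=1).mean()

# Alert rule: sustained over-temperature or any non-zero error code.
alerts = telemetry[(telemetry.temp_roll_mean > 75) | (telemetry.error_code != 0)]
print(alerts[["timestamp", "bearing_temp_c", "error_code"]])
```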

Keywords: condition based maintenance, equipment data, metrics, alerts

Procedia PDF Downloads 173
24746 Technological Innovation and Efficiency of Production of the Greek Aquaculture Industry

Authors: C. Nathanailides, S. Anastasiou, A. Dimitroglou, P. Logothetis, G. Kanlis

Abstract:

In the present work, we reviewed historical data of the Greek marine aquaculture industry, including the adoption of new methods and technological innovation. The results indicate that the industry exhibited a rapid rise in production efficiency, employment, and adoption of new technologies, which reduced outbreaks of disease, production risk, and the price of farmed fish. The improvements from total quality practices and technological input in the Greek aquaculture industry include improved survival, growth, and body shape of farmed fish, resulting from the development of new aquaculture feeds and the genetic selection of the broodstock. Improvements in the quality of the final product were also achieved via technological input in the methods and technology applied during harvesting, packaging, and transportation-preservation of farmed fish, ensuring high quality of the product from the fish farm to the plate of the consumer. These parameters (health management, nutrition, genetics, harvesting and post-harvesting methods and technology) changed significantly over the last twenty years, and the results of these improvements are reflected in the production efficiency of the aquaculture industry and the quality of the final product. It is concluded that the Greek aquaculture industry exhibited rapid growth and adoption of technologies, and supply stabilized after the global financial crisis; nevertheless, the development of the Greek aquaculture industry is currently limited by international trade sanctions, the credit crunch, and increased taxation, not by limited technology or resources.

Keywords: innovation, aquaculture, total quality, management

Procedia PDF Downloads 366
24745 Ethics Can Enable Open Source Data Research

Authors: Dragana Calic

Abstract:

The openness, availability, and sheer volume of big data have provided what some regard as an invaluable and rich dataset. Researchers, businesses, advertising agencies, and medical institutions, to name only a few, collect, share, and analyze this data to enable their processes and decision making. However, there are important ethical considerations associated with the use of big data. The rapidly evolving nature of online technologies has overtaken the many legislative, privacy, and ethical frameworks and principles that exist. For example, should we obtain consent to use people’s online data, and under what circumstances can privacy considerations be overridden? Current guidance on how to appropriately and ethically handle big data is inconsistent. Consequently, this paper focuses on two quite distinct but related ethical considerations that are at the core of the use of big data for research purposes: empowering the producers of data and empowering researchers who want to study big data. The first consideration focuses on informed consent, which is at the core of empowering producers of data. In this paper, we discuss some of the complexities associated with informed consent and consider studies of producers’ perceptions to inform research ethics guidelines and practice. The second consideration focuses on the researcher; similarly, we explore studies that focus on researchers’ perceptions and experiences.

Keywords: big data, ethics, producers’ perceptions, researchers’ perceptions

Procedia PDF Downloads 278
24744 Hybrid Reliability-Similarity-Based Approach for Supervised Machine Learning

Authors: Walid Cherif

Abstract:

Data mining has, over recent years, seen big advances because of the spread of the Internet, which generates a tremendous volume of data every day, and because of the immense advances in technologies that facilitate the analysis of these data. In particular, classification techniques are a subdomain of data mining that determines to which group each data instance belongs within a given dataset. They are used to classify data into different classes according to desired criteria. Generally, a classification technique is either statistical or machine learning, and each type has its own limits. Nowadays, data are becoming increasingly heterogeneous; consequently, current classification techniques are encountering many difficulties. This paper defines new measure functions to quantify the resemblance between instances and then combines them in a new approach that differs from existing algorithms in its reliability computations. The proposed approach outperformed most common classification techniques, with an f-measure exceeding 97% on the Iris dataset.
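
The authors' measure functions are not reproduced in this abstract, so the sketch below only shows the general style of approach it benchmarks against: a similarity-based supervised classifier (here a plain k-nearest-neighbours stand-in) evaluated by f-measure on the Iris dataset.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import f1_score

# Similarity-based classification stand-in: distance acts as (inverse)
# similarity between instances. Not the paper's reliability computation.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("macro f-measure:", f1_score(y_te, clf.predict(X_te), average="macro"))
```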

Keywords: data mining, knowledge discovery, machine learning, similarity measurement, supervised classification

Procedia PDF Downloads 457
24743 Seismic Data Scaling: Uncertainties, Potential and Applications in Workstation Interpretation

Authors: Ankur Mundhra, Shubhadeep Chakraborty, Y. R. Singh, Vishal Das

Abstract:

Seismic data scaling affects the dynamic range of the data; with present-day low storage costs and the high reliability of hard-disk storage, scaling is generally not recommended. However, when dealing with data of different vintages, which may have been processed in 16 or even 8 bits and need to be combined with available 32-bit data, scaling is performed. Scaling also amplifies low-amplitude events in the deeper region, which would otherwise disappear as high-amplitude shallow events saturate the amplitude scale. We have focused on the significance of scaling data to aid interpretation. This study elucidates a proper seismic loading procedure in workstations without using the default preset parameters available in most software suites. Differences and the distribution of amplitude values at different depths in seismic data are probed in this exercise. Proper loading parameters are identified, and the associated steps that need to be taken care of while loading data are explained. Finally, the exercise interprets the uncertainties which might arise when correlating scaled and unscaled versions of seismic data with synthetics. As the seismic well tie correlates the seismic reflection events with well markers, it is used in our study to identify regions which are enhanced and/or affected by the scaling parameter(s).
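
The dynamic-range issue described above can be illustrated with a small sketch: requantizing a high-precision trace to 8 bits using a single global scalar coarsens the weak deep events to near zero. The amplitudes below are invented, not real seismic data.

```python
import numpy as np

# Toy trace: strong shallow events followed by weak deep events.
rng = np.random.default_rng(2)
trace = np.concatenate([rng.normal(0, 2000, 500),   # shallow, high amplitude
                        rng.normal(0, 5, 500)])     # deep, low amplitude

scale = 127.0 / np.abs(trace).max()                 # one scalar for the trace
trace8 = np.clip(np.round(trace * scale), -128, 127).astype(np.int8)

deep = slice(500, 1000)
rms = lambda x: float(np.sqrt(np.mean(np.square(x))))
print("deep RMS, original:", rms(trace[deep]))
print("deep RMS, after 8-bit round trip:", rms(trace8[deep] / scale))
```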

Keywords: clipping, compression, resolution, seismic scaling

Procedia PDF Downloads 460
24742 Monte Carlo Pathwise Sensitivities for Barrier Options with Application to Coco-Bond Calibration

Authors: Thomas Gerstner, Bastian von Harrach, Daniel Roth

Abstract:

The Monte Carlo pathwise sensitivities approach is well established for smooth payoff functions. In this work, we present a new Monte Carlo algorithm that is able to calculate the pathwise sensitivities for discontinuous payoff functions. Our main tool is the one-step survival idea of Glasserman and Staum. Although this technique yields new terms per observation when differentiating, the algorithm is still efficient. As an application, we use the results for a two-dimensional calibration of a Coco-Bond, which we model with different types of discretely monitored barrier options.
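
Since the abstract contrasts its contribution with the established smooth-payoff case, here is a short sketch of that baseline: a Monte Carlo pathwise delta for a European call under Black-Scholes dynamics, where the pathwise derivative is e^(-rT) 1{S_T > K} S_T / S_0. The one-step survival extension for discretely monitored barriers is not reproduced; parameters are illustrative.

```python
import numpy as np

# Pathwise delta for a European call under geometric Brownian motion.
S0, K, r, sigma, T, n = 100.0, 100.0, 0.05, 0.2, 1.0, 200_000
rng = np.random.default_rng(0)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T
                 + sigma * np.sqrt(T) * rng.standard_normal(n))

price = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()
delta = np.exp(-r * T) * ((ST > K) * ST / S0).mean()   # pathwise derivative
print(f"price ~ {price:.3f}, pathwise delta ~ {delta:.3f}")
```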

Keywords: Monte Carlo, discretely monitored barrier options, pathwise sensitivities, Coco-Bond

Procedia PDF Downloads 345
24741 Design of 100 kW Induction Generator for Wind Power Plant at Tamanjaya Village-Sukabumi

Authors: Andri Setiyoso, Agus Purwadi, Nanda Avianto Wicaksono

Abstract:

This paper presents an induction generator design with 100 kW output capacity. An induction machine was chosen because of its capability to convert electrical energy to mechanical energy and vice versa while operating at variable speed. A Stator Controlled Induction Generator (SCIG) was applied in a wind power plant in Desa Taman Jaya, Sukabumi, Indonesia. The generator was designed to generate 100 kW at a wind speed of 12 m/s, with a survival wind speed of 21 m/s.
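
As a back-of-envelope check (not the authors' design calculation), the sketch below estimates the rotor swept area needed to deliver 100 kW electrical at the 12 m/s rated wind speed, under the assumed values of power coefficient Cp = 0.40 and drivetrain/generator efficiency 0.90.

```python
import math

# Wind power: P = 0.5 * rho * A * Cp * v^3 * eta (eta: drivetrain/generator).
rho, v, cp, eta, target_w = 1.225, 12.0, 0.40, 0.90, 100e3

area = target_w / (0.5 * rho * cp * eta * v**3)
print(f"swept area ~ {area:.0f} m^2, "
      f"rotor diameter ~ {2 * math.sqrt(area / math.pi):.1f} m")
```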

Keywords: wind energy, induction generator, Stator Controlled Induction Generator (SCIG), variable speed generator

Procedia PDF Downloads 495
24740 A Call for Justice and a New Economic Paradigm: Analyzing Counterhegemonic Discourses for Indigenous Peoples' Rights and Environmental Protection in Philippine Alternative Media

Authors: B. F. Espiritu

Abstract:

This paper examines the resistance of the Lumad people, the indigenous peoples of Mindanao, Southern Philippines, and of environmental and human rights activists to the Philippine government's neoliberal policies, and their call for justice and a new economic paradigm that will uphold peoples' rights and environmental protection, as articulated in two alternative media online sites. The study contributes to the body of knowledge on indigenous resistance to neoliberal globalization and the quest for a new economic paradigm that upholds social justice for the marginalized in society, empathy and compassion for those who depend on the land for their survival, and environmental sustainability. The study analyzes the discourses in selected news articles from Davao Today and in Kalikasan (translated to English as 'Nature') People's Network for the Environment’s statements and advocacy articles for the Lumad and the environment from 2018 to February 2020. The study reveals that the alternative media news articles and the advocacy articles contain statements that expose the oppression and violation of the human rights of the Lumad people, farmers, government environmental workers, and environmental activists, as shown in their killings, illegal arrest and detention, the displacement of the indigenous peoples, the destruction of their schools by military and paramilitary groups, and environmental plunder and destruction enabled by the government's permits for the entry and operation of extractive and agribusiness industries in the Lumad ancestral lands. Anchored in Christian Fuchs's theory of alternative media as critical media and Bert Cammaerts' theorization of alternative media as counterhegemonic media that are part of civil society and form a third voice between state media and commercial media, the study reveals the counterhegemonic discourses of the news and advocacy articles that oppose the dominant economic system of neoliberalism, which oppresses the people who depend on the land for their survival. Furthermore, the news and advocacy articles seek to advance social struggles that transform society towards the realization of cooperative potentials, or a new economic paradigm that upholds economic democracy, in which local people, including the indigenous peoples, are economically empowered and their environment protected, towards the realization of self-sustaining communities. The study highlights the call for justice, empathy, and compassion for both the people and the environment, and the need for a new economic paradigm wherein indigenous peoples and local communities are empowered towards becoming self-sustaining communities in a sustainable environment.

Keywords: alternative media, environmental sustainability, human rights, indigenous resistance

Procedia PDF Downloads 131
24739 Association of Social Data as a Tool to Support Government Decision Making

Authors: Diego Rodrigues, Marcelo Lisboa, Elismar Batista, Marcos Dias

Abstract:

Based on data on child labor, this work raises questions about how to understand and locate the factors behind child labor rates, and which properties are important for analyzing these cases. Using data mining techniques to discover valid patterns in Brazilian social databases, we evaluated data on child labor in the State of Tocantins (located in the north of Brazil, with a territory of 277,000 km2 comprising 139 counties). This work aims to detect factors that determine the practice of child labor and their relationships with financial, educational, regional, and social indicators, generating information that is not explicit in the government database and thus enabling better monitoring and updating of policies for this purpose.
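
A toy illustration of the association-mining idea follows; the records and factor names are invented, not the Tocantins data. It measures how often child labor co-occurs with a candidate factor via the support, confidence, and lift of a rule such as "low_income implies child_labor".

```python
import pandas as pd

# Invented binary indicators per household record.
df = pd.DataFrame({
    "low_income":  [1, 1, 0, 1, 0, 1, 0, 1],
    "rural":       [1, 0, 0, 1, 1, 1, 0, 0],
    "child_labor": [1, 1, 0, 1, 0, 0, 0, 1],
}).astype(bool)

# Rule: low_income -> child_labor
support = (df.low_income & df.child_labor).mean()
confidence = (df.low_income & df.child_labor).sum() / df.low_income.sum()
lift = confidence / df.child_labor.mean()   # >1 suggests positive association
print(f"support={support:.2f} confidence={confidence:.2f} lift={lift:.2f}")
```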

Keywords: social data, government decision making, association of social data, data mining

Procedia PDF Downloads 363
24738 Outlier Detection in Stock Market Data Using Tukey Method and Wavelet Transform

Authors: Sadam Alwadi

Abstract:

Outlier values are a problem that frequently occurs in the data observation or recording process; thus, data imputation has become an essential matter. In this work, the methods described in prior work are used to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). Tukey and Maximal Overlap Discrete Wavelet Transform (MODWT) methods are used to detect and impute the outlier values.
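
The Tukey half of this pipeline can be sketched directly (the MODWT step is omitted): flag prices outside the fences [Q1 - 1.5*IQR, Q3 + 1.5*IQR] and impute them, here with the median of the remaining values. The prices are invented, not ASE data.

```python
import numpy as np

# Invented closing prices with two planted outliers.
prices = np.array([4.10, 4.12, 4.08, 9.50, 4.15, 4.11, 4.09, 0.40, 4.13])

q1, q3 = np.percentile(prices, [25, 75])
iqr = q3 - q1
outlier = (prices < q1 - 1.5 * iqr) | (prices > q3 + 1.5 * iqr)

imputed = prices.copy()
imputed[outlier] = np.median(prices[~outlier])   # simple median imputation
print("flagged indices:", np.where(outlier)[0], "->", imputed[outlier])
```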

Keywords: outlier values, imputation, stock market data, detecting, estimation

Procedia PDF Downloads 75
24737 The Chemistry in the Video Game No Man’s Sky

Authors: Diogo Santos, Nelson Zagalo, Carla Morais

Abstract:

No Man’s Sky (NMS) is a sci-fi video game about survival and exploration where players fly spaceships, search for elements, and use them to survive. NMS isn’t a serious game, and not all the science in the game is presented with scientific accuracy. To find out how players felt about the scientific content in the game and how they perceive the chemistry in it, a survey was sent to NMS’s players, from which answers were collected from 124 respondents in 23 countries. Chemophobia is still a phenomenon when chemistry or chemicals are a subject of discussion, but 68.9% of our respondents showed a positive attitude towards the presence of chemistry in NMS, with 57% stating that playing the video game motivated them to know more about science. 8% of the players stated that NMS often prompted conversations about the science in the video game between them and teachers, parents, or friends. These results give us ideas on how an entertainment game can potentially help scientists, educators, and science communicators reach a growing, evolving, vibrant, diverse, and demanding audience.

Keywords: digital games, science communication, chemistry, informal learning, No Man’s Sky

Procedia PDF Downloads 101
24736 PEINS: A Generic Compression Scheme Using Probabilistic Encoding and Irrational Number Storage

Authors: P. Jayashree, S. Rajkumar

Abstract:

With social networks and smart devices generating a multitude of data, effective data management is the need of the hour for networks and cloud applications. Some applications need effective storage while others need effective communication over networks, and data reduction comes as a handy solution to meet both requirements. Most data compression techniques are based on data statistics and may result in either lossy or lossless data reduction. Though lossy reductions produce better compression ratios than lossless methods, many applications require data accuracy and miniature details to be preserved. A variety of data compression algorithms exists in the literature for different forms of data, such as text, image, and multimedia data. In the proposed work, a generic progressive compression algorithm based on probabilistic encoding, called PEINS, is proposed as an enhancement of the irrational-number-storage coding technique, to cater to the storage issues of increasing data volumes as a cost-effective solution that also offers data security as a secondary outcome to some extent. The proposed work reveals cost effectiveness in terms of a better compression ratio with no deterioration in compression time.

Keywords: compression ratio, generic compression, irrational number storage, probabilistic encoding

Procedia PDF Downloads 281
24735 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway, and presents the design of a data privacy model and a framework for real-time data analytics using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that combining two or more privacy models yields stronger protection of the data. The fog storage gateway also has several advantages over traditional cloud storage: our results show that fog reduces latency/delay, bandwidth consumption, and energy usage compared with cloud storage, and fog storage can therefore help to lessen excessive costs. The paper concentrates on the system descriptions, with a focus on the research design and framework design for the data privacy model, data storage, and real-time analytics. It also presents the major system components and their framework specifications. Finally, the overall system architecture is shown, together with its structure and interrelationships.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 89
24734 Comparison of Selected Pier-Scour Equations for Wide Piers Using Field Data

Authors: Nordila Ahmad, Thamer Mohammad, Bruce W. Melville, Zuliziana Suif

Abstract:

Current methods for predicting local scour at wide bridge piers were developed on the basis of laboratory studies, and very few scour predictions have been tested against field data. A laboratory wide-pier scour equation from previous findings is evaluated here with field data. A wide range of field data, comprising both live-bed and clear-water scour, was used. A method for assessing the quality of the data was developed and applied to the data set. Three other wide-pier scour equations from the literature were used to compare the performance of each predictive method, and the best-performing scour equation was identified using statistical analysis. Comparisons of computed and observed scour depths indicate that the equation from the previous publication produced the smallest discrepancy ratio and RMSE value when compared with the large amount of laboratory and field data.
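
The comparison metrics used above are easily sketched: the discrepancy ratio (computed over observed scour depth, ideally 1.0) and the RMSE between computed and observed depths. The depths below are invented, not the field data set.

```python
import numpy as np

observed = np.array([2.1, 3.4, 1.8, 4.0, 2.7])   # m, hypothetical field scour
computed = np.array([2.5, 3.1, 2.4, 4.6, 2.9])   # m, from some pier equation

dr = computed / observed                          # discrepancy ratio per site
rmse = np.sqrt(np.mean((computed - observed) ** 2))
print(f"mean discrepancy ratio = {dr.mean():.2f}, RMSE = {rmse:.2f} m")
```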

Keywords: field data, local scour, scour equation, wide piers

Procedia PDF Downloads 396
24733 The Maximum Throughput Analysis of UAV Datalink 802.11b Protocol

Authors: Inkyu Kim, SangMan Moon

Abstract:

The IEEE 802.11b protocol provides up to an 11 Mbps data rate, whereas the aerospace industry seeks higher-data-rate COTS data link systems for UAVs. The Total Maximum Throughput (TMT) and delay time have been studied by many researchers in past years. This paper provides the theoretical data throughput performance of a UAV formation flight data link using existing 802.11b performance theory. We operate a UAV formation flight of more than 30 quadcopters with the 802.11b protocol. We predict that the number of UAVs in a formation flight is bounded by the data link protocol's performance limitations.
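
A textbook-style estimate of the single-sender 802.11b maximum throughput (DCF, long preamble, no RTS/CTS) can be sketched as below; the timing constants are the common DSSS values and the ACK rate is an assumption, not measured UAV-link figures. The result, well below 11 Mbit/s, illustrates why TMT rather than the nominal rate bounds the formation size.

```python
# Common 802.11b DSSS timing assumptions (seconds / bytes / bits-per-second).
SLOT, SIFS, DIFS = 20e-6, 10e-6, 50e-6
PREAMBLE = 192e-6                        # long PLCP preamble + header
MAC_HDR, ACK_LEN = 28, 14                # MAC header + FCS; ACK frame
DATA_RATE, ACK_RATE = 11e6, 2e6          # ACK at a basic rate (assumed)

def tmt(payload_bytes, cw_min=31):
    backoff = (cw_min / 2) * SLOT        # mean post-DIFS backoff
    t_data = PREAMBLE + (payload_bytes + MAC_HDR) * 8 / DATA_RATE
    t_ack = PREAMBLE + ACK_LEN * 8 / ACK_RATE
    cycle = DIFS + backoff + t_data + SIFS + t_ack
    return payload_bytes * 8 / cycle     # bits/s

print(f"TMT at 1500 B ~ {tmt(1500) / 1e6:.2f} Mbit/s")
```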

Keywords: UAV datalink, UAV formation flight datalink, UAV WLAN datalink application, UAV IEEE 802.11b datalink application

Procedia PDF Downloads 381
24732 Low SPOP Expression and High MDM2 Expression Are Associated with Tumor Progression and Predict Poor Prognosis in Hepatocellular Carcinoma

Authors: Chang Liang, Weizhi Gong, Yan Zhang

Abstract:

Purpose: Hepatocellular carcinoma (HCC) is a malignant tumor with a high mortality rate and poor prognosis worldwide. Murine double minute 2 (MDM2) regulates the tumor suppressor p53, increasing cancer risk and accelerating tumor progression. Speckle-type POX virus and zinc finger protein (SPOP), a key subunit of the Cullin-RING E3 ligase, inhibits tumorigenesis and tumor progression through the ubiquitination of its downstream substrates. This study aimed to clarify whether SPOP and MDM2 are mutually regulated in HCC and to examine the correlation between SPOP and MDM2 expression and the prognosis of HCC patients. Methods: First, the expression of SPOP and MDM2 in HCC tissues was examined using the TCGA database. Then, 53 paired samples of HCC tumor and adjacent tissues were collected to evaluate the expression of SPOP and MDM2 using immunohistochemistry. Chi-square or Fisher’s exact tests were used to analyze the relationship between clinicopathological features and the expression levels of SPOP and MDM2. In addition, Kaplan–Meier curve analysis and the log-rank test were used to investigate the effects of SPOP and MDM2 on the survival of HCC patients. Finally, a multivariate Cox proportional hazards regression model was used to analyze whether the expression levels of SPOP and MDM2 were independent risk factors for the prognosis of HCC patients. Results: Bioinformatics analysis revealed that low expression of SPOP and high expression of MDM2 were related to worse prognosis of HCC patients. The relationships between the expression of SPOP and MDM2 and tumor stem-like features showed opposite trends. Immunohistochemistry showed that SPOP protein was significantly downregulated, while MDM2 protein was significantly upregulated, in HCC tissue compared with para-cancerous tissue. Tumors with low SPOP expression were related to worse T stage and Barcelona Clinic Liver Cancer (BCLC) stage, while tumors with high MDM2 expression were related to worse T stage, M stage, and BCLC stage. Kaplan–Meier curves showed that HCC patients with high SPOP expression and low MDM2 expression had better survival than those with low SPOP expression and high MDM2 expression (P < 0.05). The multivariate Cox proportional hazards regression model confirmed that a high MDM2 expression level was an independent risk factor for poor prognosis in HCC patients (P < 0.05). Conclusion: SPOP protein was significantly downregulated, while MDM2 was significantly upregulated, in HCC. Low expression of SPOP and high expression of MDM2 were associated with malignant progression and poor prognosis of HCC patients, indicating a potential therapeutic target for HCC.
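
The Kaplan–Meier and log-rank comparison described above can be sketched with the lifelines library; the follow-up times, event indicators, and group labels below are invented for illustration, not the study's patient data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Invented follow-up data: months observed, death indicator, MDM2 group.
df = pd.DataFrame({
    "months":    [6, 14, 22, 30, 41, 8, 11, 16, 19, 25],
    "death":     [1, 1, 0, 1, 0, 1, 1, 1, 0, 1],
    "mdm2_high": [0, 0, 0, 0, 0, 1, 1, 1, 1, 1],
})

km = KaplanMeierFitter()
for grp, sub in df.groupby("mdm2_high"):
    km.fit(sub.months, sub.death, label=f"MDM2 high={bool(grp)}")
    print(km.median_survival_time_)

low, high = df[df.mdm2_high == 0], df[df.mdm2_high == 1]
res = logrank_test(low.months, high.months, low.death, high.death)
print("log-rank p =", res.p_value)
```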

Keywords: hepatocellular carcinoma, murine double minute 2, speckle-type POX virus and zinc finger protein, ubiquitination

Procedia PDF Downloads 130
24731 Methods for Distinction of Cattle Using Supervised Learning

Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl

Abstract:

Machine learning represents a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction, based on models derived from existing data. The data can present identification patterns, which are used to classify instances into groups. The result of the analysis is a pattern that can be used to identify new data without needing the input data used to create it. An important requirement in this process is careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. In the case of missing pedigree information, other methods can be used for the traceability of an animal's origin. The genetic diversity encoded in genetic data holds relatively useful information for identifying animals originating from individual countries. We can conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying individuals.
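
A generic sketch of the supervised-learning step follows (not the study's actual genotypes or model): predicting an animal's population of origin from SNP markers coded 0/1/2, using simulated genotypes drawn from two populations with diverged allele frequencies.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Simulated genotype matrix: 0/1/2 copies of the alternative allele.
rng = np.random.default_rng(4)
n_animals, n_snps = 120, 300
origin = rng.integers(0, 2, n_animals)              # two hypothetical populations
freq = np.where(origin[:, None] == 0, 0.3, 0.6)     # diverged allele frequencies
genotypes = rng.binomial(2, freq, (n_animals, n_snps))

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, genotypes, origin, cv=5).mean())
```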

Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning

Procedia PDF Downloads 539
24730 Router 1X3 - RTL Design and Verification

Authors: Nidhi Gopal

Abstract:

Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines within one single network). This paper mainly emphasizes the study of the router device and its top-level architecture, and how the various sub-modules of the router, i.e., Register, FIFO, FSM, and Synchronizer, are synthesized, simulated, and finally connected to the top module.
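
The data path of a 1x3 router can be illustrated with a small conceptual software model (this is not the RTL itself, and the header format is an assumption): the first byte of each packet is treated as a destination address that steers the remaining bytes into one of three output FIFOs.

```python
from collections import deque

# Conceptual model of the 1x3 router: one input, three output FIFOs.
class Router1x3:
    def __init__(self):
        self.fifos = [deque(), deque(), deque()]   # one FIFO per output port

    def route(self, packet):
        dest, payload = packet[0], packet[1:]      # header byte selects port
        if dest not in (0, 1, 2):
            raise ValueError("bad destination")    # the FSM would flag this
        self.fifos[dest].extend(payload)

r = Router1x3()
r.route([2, 0xDE, 0xAD, 0xBE, 0xEF])
print([len(f) for f in r.fifos])                   # -> [0, 0, 4]
```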

Keywords: data packets, networking, router, routing

Procedia PDF Downloads 793
24729 Infant and Young Child-Feeding Practices in Mongolia

Authors: Otgonjargal Damdinbaljir

Abstract:

Background: Infant feeding practices have a major role in determining the nutritional status of children and are associated with household socioeconomic and demographic factors. In 2010, Mongolia used the WHO 2008 edition of the Indicators for Assessing Infant and Young Child Feeding Practices for the first time. Objective: To evaluate the feeding status of infants and young children under 2 years old in Mongolia. Materials and Methods: The study was conducted by cluster random sampling. Data on the breastfeeding and complementary feeding of 350 infants and young children aged 0-23 months in 21 provinces of the 4 economic regions of the country and the capital, Ulaanbaatar city, were collected through questionnaires. The feeding status was analyzed according to the WHO 2008 indicators. Analysis of data: Survey data was analysed using PASW Statistics 18.0 and EPI INFO 2000 software. For the calculation of overall measures for the entire survey sample, analyses were stratified by region. Age-specific feeding patterns were described using frequencies, proportions, and survival analysis. Logistic regression was done with feeding practice as the dependent variable and socio-demographic factors as independent variables. Simple proportions were calculated for each IYCF indicator, and the differences in feeding practices between sexes and age groups, if any, were noted using the chi-square test. Ethics: The Ethics Committee under the auspices of the Ministry of Health approved the study. Results: A total of 350 children aged 0-23 months were investigated. The rate of ever breastfeeding of children aged 0-23 months reached 98.2%, while the percentage of early initiation of breastfeeding was only 85.5%. The rates of exclusive breastfeeding under 6 months, continued breastfeeding for 1 year, and continued breastfeeding for 2 years were 71.3%, 74%, and 54.6%, respectively. The median time of giving complementary food was the 6th month, and the weaning time was the 9th month. The rate of timely introduction of complementary food at 6-8 months was 80.3%. The rates of minimum dietary diversity, minimum meal frequency, and consumption of iron-rich or iron-fortified foods among children aged 6-23 months were 52.1%, 80.8% (663/813), and 30.1%, respectively. Conclusions: The main problems revealed by the study were the inadequate variety and frequency of complementary foods and the low rate of consumption of iron-rich or iron-fortified foods. Our findings highlight the need to encourage mothers to enrich their traditional wheat-based complementary foods by adding more animal-source foods and vegetables.

Keywords: complementary feeding, early initiation of breastfeeding, exclusive breastfeeding, minimum meal frequency

Procedia PDF Downloads 466