Search results for: innovation maturity models

6607 Hydrocarbon Source Rocks of the Maragh Low

Authors: Elhadi Nasr, Ibrahim Ramadan

Abstract:

Biostratigraphical analyses of well sections from the Maragh Low in the Eastern Sirt Basin have allowed high-resolution correlations to be undertaken. Full integration of these data with available palaeoenvironmental, lithological, gravity, seismic, aeromagnetic, igneous, radiometric and wireline log information, together with a geochemical analysis of source rock quality and distribution, has led to a more detailed understanding of the geological and structural history of this area. Below the Sirt Unconformity, two superimposed rifting cycles have been identified. The older is represented by the Amal Group of sediments and is of Late Carboniferous (Kasimovian/Gzhelian) to Middle Triassic (Anisian) age. Unconformably overlying it is a younger rift cycle, represented by the Sarir Group of sediments, of Early Cretaceous (late Neocomian to Aptian) age. Overlying the Sirt Unconformity is the marine Late Cretaceous section. An assessment of pyrolysis results and a palynofacies analysis have allowed hydrocarbon source facies and quality to be determined. There are a number of hydrocarbon source rock horizons in the Maragh Low; these are sometimes vertically stacked, and they are of fair to excellent quality. The oldest identified source rock is the Triassic Shale; this unit is unconformably overlain by sandstones belonging to the Sarir Group and conformably overlies a Triassic Siltstone unit. Palynological dating of the Triassic Shale unit indicates a Middle Triassic (Anisian) age. The Triassic Shale is interpreted to have been deposited in a lacustrine palaeoenvironment. This is evidenced in particular by the dark, fine-grained, organic-rich nature of the sediment and is supported by palynofacies analysis and by the recovery of fish fossils. Geochemical analysis of the Triassic Shale indicates total organic carbon varying between 1.37% and 3.53%. S2 pyrolysate yields vary between 2.15 mg/g and 6.61 mg/g, and hydrogen indices vary between 156.91 and 278.91. The source quality of the Triassic Shale varies from fair to very good/rich. Given its thermal maturity, it is now a very good source for light oil and gas; it was once a very good to rich oil source. The Early Barremian Shale was also deposited in a lacustrine palaeoenvironment. Recovered palynomorphs indicate an Early Cretaceous (late Neocomian to early Barremian) age. The Early Barremian Shale is conformably underlain and overlain by sandstone units belonging to the Sarir Group of sediments, which are also of Early Cretaceous age. Geochemical analysis of the Early Barremian Shale indicates that it is a good oil source and was originally very good. Total organic carbon varies between 3.59% and 7%. S2 varies between 6.30 mg/g and 10.39 mg/g, and the hydrogen indices vary between 148.4 and 175.5. A Late Barremian Shale unit has also been identified in the central Maragh Low. Geochemical analyses indicate that total organic carbon varies between 1.05% and 2.38%, S2 pyrolysate between 1.6 and 5.34 mg/g, and the hydrogen index between 152.4 and 224.4. It is a good oil source rock which is now mature. In addition to these non-marine hydrocarbon source rocks below the Sirt Unconformity, three formations in the overlying Late Cretaceous section also provide hydrocarbon-quality source rocks. Interbedded shales within the Rachmat Formation of Late Cretaceous (early Campanian) age have total organic carbon ranging between 0.7% and 1.47%, S2 pyrolysate varying between 1.37 and 4.00 mg/g, and hydrogen indices varying between 195.7 and 272.1. The indication is that this unit would provide a fair gas source to a good oil source. Geochemical analyses of the overlying Tagrifet Limestone indicate that total organic carbon varies between 0.26% and 1.01%, S2 pyrolysate varies between 1.21 and 2.16 mg/g, and hydrogen indices vary between 195.7 and 465.4. For the overlying Sirt Shale Formation of Late Cretaceous (late Campanian) age, total organic carbon varies between 1.04% and 1.51%, S2 pyrolysate varies between 4.65 mg/g and 6.99 mg/g, and the hydrogen indices vary between 151 and 462.9. The study has shown that both the Sirt Shale Formation and the Tagrifet Limestone are good to very good, rich sources for oil in the Maragh Low. High-resolution biostratigraphical interpretations have been integrated and calibrated with thermal maturity determinations (Vitrinite Reflectance (%Ro), Spore Colour Index (SCI) and Tmax (ºC)) and the determined present-day geothermal gradient of 25 ºC/km for the Maragh Low. Interpretation of the generated basin modelling profiles allows a detailed prediction of the timing of maturation of these source horizons and leads to a determination of the amounts of missing section at major unconformities. From the results, the top of the oil window (0.72% Ro) is picked as high as 10,700' and, assuming a linear trend and by projection, the base of the oil window (1.35% Ro) is picked as low as 18,000' in the Maragh Low. For the Triassic Shale, the early phase of oil generation was in the Late Palaeocene / Early to Middle Eocene, and the main phase of oil generation was in the Middle to Late Eocene. The Early Barremian Shale reached the main phase of oil generation in the Early Oligocene, with late generation being reached in the Middle Miocene. For the Rakb Group section (Rachmat Formation, Tagrifet Limestone and Sirt Shale Formation), the early phase of oil generation started in the Late Eocene, with the main phase of generation between the Early Oligocene and the Early Miocene. From maturity profiles and from regional considerations, it can be predicted that up to 500' of sediment may have been deposited and eroded at the Sirt Unconformity in the central Maragh Low, while up to 2000' may have been deposited and then eroded to the south of the trough.

Keywords: hydrocarbon source rocks, geochemical analysis, Maragh Low, Eastern Sirt Basin

Procedia PDF Downloads 408
6606 Variability Management of Contextual Feature Model in Multi-Software Product Line

Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz

Abstract:

The Software Product Line (SPL) paradigm is used for the development of families of software products that share common and variable features. A feature model is a domain artifact of SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs, such as those for mobile phones and tablets, contain a number of similar common and variable features. Reusing common and variable features across the different domains of SPL is a complex task due to the external relationships and constraints of features in the feature model. To increase the reusability of feature model resources from domain engineering, it is necessary to manage the commonality of features at the level of SPL application development. In this research, we propose an approach that combines multiple SPLs into a single domain and converts them into a common feature model. Extracting the common features from different feature models is more effective and reduces the cost and time to market of application development. For extracting features from multiple SPLs, the proposed framework consists of three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of variation points and constraints. By using this approach, the reusability of features across multiple feature models can be increased. The impact of this research is to reduce development cost and time to market and to increase the number of SPL products.
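
A minimal sketch of the three-step combination on two toy feature models; the feature names, constraint triples, and set-based representation are illustrative assumptions, not the paper's notation:

```python
# Two toy feature models (invented features); constraints are (feature,
# relation, feature) triples standing in for cross-tree constraints.
phone = {"features": {"call", "sms", "camera", "gps"},
         "constraints": {("camera", "requires", "storage")}}
tablet = {"features": {"call", "camera", "gps", "stylus"},
          "constraints": {("stylus", "excludes", "sms")}}

def combine(*spls):
    # Step 1: variation points = features not shared by every SPL
    all_features = set.union(*(s["features"] for s in spls))
    common = set.intersection(*(s["features"] for s in spls))
    variation_points = all_features - common
    # Step 2: collect the cross-tree constraints from every model
    constraints = set.union(*(s["constraints"] for s in spls))
    # Step 3: merge into one feature model split by commonality
    return {"common": common, "variable": variation_points,
            "constraints": constraints}

print(combine(phone, tablet))
```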

Keywords: software product line, feature model, variability management, multi-SPLs

Procedia PDF Downloads 69
6605 Modeling Football Penalty Shootouts: How Improving Individual Performance Affects Team Performance and the Fairness of the ABAB Sequence

Authors: Pablo Enrique Sartor Del Giudice

Abstract:

Penalty shootouts often decide the outcome of important soccer matches. Although usually referred to as "lotteries", there is evidence that some national teams and clubs consistently perform better than others. The outcomes are therefore not explained by mere luck, and there are ways to improve the average performance of players, naturally at the expense of some sort of effort. In this article, we study the payoff of individual performance improvements in terms of the performance of the team as a whole. To do so, we develop an analytical model with static individual performances, as well as Monte Carlo models that take into account the known influence of the partial score and the round number on individual performances. We find that, within a range of usual values, team performance improves more than 70% faster than individual performances do. Using these models, we also estimate that the new ABBA penalty shootout ordering under test removes almost all of the known bias in favor of the first-shooting team under the current ABAB system.
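
A minimal Monte Carlo sketch of the ABAB versus ABBA comparison. The scoring probabilities and the score-pressure penalty are assumed illustrative numbers, not the paper's calibrated model; the pressure term is what makes the kick order matter at all:

```python
import random

def kick_prob(score, team, base=0.76, penalty=0.06):
    # Assumed pressure effect: scoring probability drops when the kicker's
    # team is currently behind (illustrative numbers, not the paper's).
    other = "B" if team == "A" else "A"
    return base - penalty * (score[team] < score[other])

def shootout(scheme):
    score, r = {"A": 0, "B": 0}, 0
    while r < 5 or score["A"] == score["B"]:   # 5 rounds, then sudden death
        # ABAB: A always kicks first; ABBA: first kicker alternates by round
        order = "AB" if (scheme == "ABAB" or r % 2 == 0) else "BA"
        for team in order:
            score[team] += random.random() < kick_prob(score, team)
        r += 1
    return "A" if score["A"] > score["B"] else "B"

def first_shooter_win_rate(scheme, trials=100_000):
    return sum(shootout(scheme) == "A" for _ in range(trials)) / trials

print("ABAB:", first_shooter_win_rate("ABAB"))
print("ABBA:", first_shooter_win_rate("ABBA"))
```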

Keywords: football, penalty shootouts, Monte Carlo simulation, ABBA

Procedia PDF Downloads 162
6604 Makerspaces as Centers of Innovation: An Assessment of the Impact of Technology Incubation Centers in Nigeria

Authors: Bisi Olawoyin

Abstract:

The idea of knowledge sharing facilitated by the internet and complemented by a collaborative offline process in the form of shared workshops called makerspaces has become an attractive economic development agenda worldwide. Towards this end, Nigeria has established a number of Technology Incubation Centers (TICs) across the country with a view to using them as institutional mechanisms for commercializing research and development results, thus helping to promote venture creation and economic development. This study examines the impact of nurturing by the TICs on the performance of selected incubated enterprises that have grown into medium-scale businesses in different sectors of the economy. The objective is to determine the extent to which the process of incubation has contributed to their growth relative to similar businesses that developed outside the TICs. Six enterprises nurtured by TICs and six others outside them were selected for the study. Data were collected for the twelve enterprises covering their first five years of operation. Performance in terms of annual turnover, market share, and product range was analysed using scatter diagrams plotting these variables against time, comparing TIC and non-TIC enterprises. Results showed an initial decline in performance for most of the incubatees in the first two years, due to sluggish adjustment to the withdrawal of subsidies enjoyed at the TICs. However, four of them were able to catch up with improved performance and surpass their non-TIC counterparts consistently from the third year. Analysis of year-on-year performance also showed average growth rates of 7% and 5% for TIC and non-TIC enterprises, respectively. The study therefore concludes that TICs have a great role to play in nurturing new, innovative businesses, but sees the need for government to address the provision of critical facilities, especially electricity and utilities, which constitute critical cost components for businesses. It must also address the issue of investment grants and loans, including the development of technology/industrial parks, to boost business survival.

Keywords: entrepreneurship, incubation, innovation, makerspaces

Procedia PDF Downloads 221
6603 Revolving Ferrofluid Flow in Porous Medium with Rotating Disk

Authors: Paras Ram, Vikas Kumar

Abstract:

The transmission of malaria with seasonality was studied through the use of mathematical models. Data on the annual number of malaria cases reported to the Division of Epidemiology, Ministry of Public Health, Thailand during the period 1997-2011 were analyzed. The transmission of malaria with seasonality was studied by formulating a mathematical model which had been modified to describe different situations encountered in the transmission of malaria. In our model, the population was separated into two groups, the human and vector groups, and a system of nonlinear differential equations was then constructed. Each human group was divided into susceptible, infectious in the hot season, infectious in the rainy season, infectious in the cool season, and recovered classes. The vector population was separated into two classes only: susceptible and infectious vectors. The analysis of the models was carried out by standard dynamical modeling methods.

Keywords: ferrofluid, magnetic field, porous medium, rotating disk, Neuringer-Rosensweig Model

Procedia PDF Downloads 421
6602 Contextual Toxicity Detection with Data Augmentation

Authors: Julia Ive, Lucia Specia

Abstract:

Understanding and detecting toxicity is an important problem in supporting safer human interactions online. Our work focuses on the important problem of contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use "toxicity" as an umbrella term to denote a number of variants commonly named in the literature, including hate, abuse, and offence, among others. Detecting toxicity in context is a non-trivial problem and has been addressed by very few previous studies. These previous studies have analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They have also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better data. In our study, we start by further analysing human perception of toxicity in conversational data (i.e., tweets), in the absence versus presence of context, in this case previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: the contextual data available does not provide sufficient evidence that context is indeed important (even for humans). The data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity with swear words, racist terms, etc.), so that context is not needed for a decision, or are ambiguous, vague or unclear even in the presence of context; in addition, the data contains labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious without context (i.e., covert cases), or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). On the contextual detection models, we posit that their poor performance is due to limitations in both the data they are trained on (the same problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on that, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our models against previous ones on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual, or contextual but agnostic of the conversation structure.
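
A minimal sketch of a hierarchy-aware classifier of the kind described: each utterance in a thread is encoded separately, then a thread-level encoder summarizes the sequence of utterance vectors before classifying the target utterance. The GRU-based architecture and all dimensions are assumptions for illustration, not the authors' exact model:

```python
import torch
import torch.nn as nn

class HierarchicalToxicityClassifier(nn.Module):
    def __init__(self, vocab=10000, emb=128, utt_h=128, ctx_h=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb, padding_idx=0)
        self.utterance_enc = nn.GRU(emb, utt_h, batch_first=True)  # word level
        self.context_enc = nn.GRU(utt_h, ctx_h, batch_first=True)  # thread level
        self.classifier = nn.Linear(ctx_h, 2)  # toxic / non-toxic logits

    def forward(self, thread):
        # thread: (batch, n_utterances, n_tokens) tensor of token ids
        b, n_utt, n_tok = thread.shape
        tokens = self.embed(thread.view(b * n_utt, n_tok))
        _, utt_vec = self.utterance_enc(tokens)            # (1, b*n_utt, H)
        utt_vec = utt_vec.squeeze(0).view(b, n_utt, -1)    # utterance vectors
        _, ctx_vec = self.context_enc(utt_vec)             # thread summary
        return self.classifier(ctx_vec.squeeze(0))

model = HierarchicalToxicityClassifier()
fake_threads = torch.randint(1, 10000, (4, 5, 20))  # 4 threads, 5 tweets each
print(model(fake_threads).shape)  # torch.Size([4, 2])
```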

Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing

Procedia PDF Downloads 170
6601 Emancipation through the Inclusion of Civil Society in Contemporary Peacebuilding: A Case Study of Peacebuilding Efforts in Colombia

Authors: D. Romero Espitia

Abstract:

Research on peacebuilding has taken a critical turn, examining the neoliberal and hegemonic conception of peace operations. Alternative peacebuilding models have been analyzed, but the scholarly discussion fails to bring them together or to form connections between them. The objective of this paper is to rethink peacebuilding by extracting the positive aspects of the various peacebuilding models, connecting them with the local context, and thereby promoting emancipation in contemporary peacebuilding efforts. Moreover, local ownership has been widely labelled as one of, if not the, core principle necessary for a successful peacebuilding project. Yet definitions of what constitutes the 'local' remain debated. Through a qualitative review of the literature, this paper unpacks the contemporary conception of peacebuilding in nexus with 'local ownership' as manifested through civil society. Using Colombia as a case study, this paper argues that a new peacebuilding framework, one that reconsiders the terms of engagement between international and national actors, is needed in order to foster effective peacebuilding efforts in contested transitional states.

Keywords: civil society, Colombia, emancipation, peacebuilding

Procedia PDF Downloads 134
6600 Modeling Waiting and Service Time for Patients: A Case Study of Matawale Health Centre, Zomba, Malawi

Authors: Moses Aron, Elias Mwakilama, Jimmy Namangale

Abstract:

Spending a long time in queues for a basic service remains a common challenge in most developing countries, including Malawi. In the health sector in particular, the Out-Patient Department (OPD) experiences long queues, which puts the lives of patients at risk. However, by using queuing analysis to understand the nature of the problems and the efficiency of service systems, such problems can be abated. Depending on the kind of service, the literature proposes different possible queuing models. However, rather than relying on the generalized models assumed in the literature, real case-study data can give a deeper understanding of the model underlying a particular problem, and of how that model can vary from one day to another and from one case to another. As such, this study uses data obtained from one urban health centre (HC) for blood pressure (BP), paediatric, and general OPD cases to investigate the average queuing time for patients within the system. It seeks to identify the proper queuing model by investigating the distribution functions of patients' arrival times, inter-arrival times, waiting times, and service times. Compared with the standard values set by WHO, the study found that patients at this HC spend more time waiting than being served. On model investigation, different days presented different models, ranging from an assumed M/M/1 and M/M/2 to M/Er/2. Through sensitivity analysis, the commonly assumed M/M/1 model generally failed to fit the data, whereas an M/Er/2 model fitted well. An M/Er/3 model seemed good in terms of measuring resource utilization, suggesting a need to increase medical personnel at this HC; an M/Er/4 model, however, was shown to cause more idleness of human resources.
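
The data-fitted models above are Erlang-service variants; for illustration of the steady-state quantities involved, here is a minimal sketch of the simplest model named, M/M/1, using textbook formulas and made-up rates rather than the clinic's data:

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 measures; lam = arrival rate, mu = service rate."""
    assert lam < mu, "queue is unstable unless utilization < 1"
    rho = lam / mu                      # server utilization
    Lq = rho ** 2 / (1 - rho)           # mean number waiting in queue
    Wq = Lq / lam                       # mean wait in queue (Little's law)
    W = Wq + 1 / mu                     # mean total time in system
    return {"utilization": rho, "avg_wait": Wq, "avg_time_in_system": W}

# e.g., 10 patients/hour arriving, one clinician serving 12 patients/hour
print(mm1_metrics(lam=10, mu=12))
```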

Keywords: health care, out-patient department, queuing model, sensitivity analysis

Procedia PDF Downloads 435
6599 Paraplegic Dimensions of Asymmetric Warfare: A Strategic Analysis for Resilience Policy Plan

Authors: Sehrish Qayyum

Abstract:

In this age of constant technological advancement, asymmetric warfare cannot be won outright. Attuned psychometric study confirms that screaming is sometimes more productive than active retaliation against strong adversaries. Asymmetric warfare is a game of nerves and thoughts, in which the least vigorous participation carries large anticipated losses. It creates a condition of paraplegia, a partial but permanent immobility, which affects core warfare operations, leaving screams rather than active retaliation. When one's own power is doubted, that doubt gains the power to ruin all planning, even planning done with superlative cost-benefit analysis. A strategically calculated estimation of asymmetric warfare across three chronological periods, from early WWI to WWII, from WWII to the Cold War, and then to the current era, shows that courage carries nations from a battle of warriors to a battle of comrades. Asymmetric warfare has been the most difficult kind to fight and survive, being unexpected and lethal despite preparations. Thought before action may be the best strategy: mixing Regional Security Complex Theory with the OODA loop to develop a Paraplegic Resilience Policy Plan (PRPP) to win asymmetric warfare. The PRPP may serve to control and halt the ongoing wave of terrorism, guerilla warfare, and insurgencies. The PRPP, along with a strategic work plan, is based on psychometric analysis to deal with any possible war condition and tactic, and to save millions of innocent lives such as those lost in Christchurch, New Zealand in 2019, the Paris attacks of November 2015, and the Berlin market attack of 2016. Getting tangled in self-imposed epistemic dilemmas leaves regret as the only outcome of performance. This is a descriptive psychometric analysis of war conditions, with a generic application of probability tests to find the best possible options and conditions for developing the PRPP for any adverse condition identified so far. Innovation in technology begets innovation in planning, and the resulting action plan serves as a rheostat approach to dealing with asymmetric warfare.

Keywords: asymmetric warfare, psychometric analysis, PRPP, security

Procedia PDF Downloads 136
6598 Modelling and Simulation Efforts in Scale-Up and Characterization of Semi-Solid Dosage Forms

Authors: Saurav S. Rath, Birendra K. David

Abstract:

The generic pharmaceutical industry has to operate under strict timelines of product development and scale-up from lab to plant. Hence, detailed product and process understanding and the implementation of appropriate mechanistic modelling and Quality-by-Design (QbD) approaches are imperative in the product life cycle. This work provides example cases of such efforts in topical dosage products. Topical products are typically emulsions, gels, thick suspensions, or even simple solutions. The efficacy of such products is determined by characteristics like rheology and morphology. Defining, and scaling up, the right manufacturing process with a given set of ingredients to achieve the right product characteristics presents a challenge to the process engineer. For example, the non-Newtonian rheology varies not only with critical process parameters (CPPs) and critical material attributes (CMAs) but is also an implicit function of globule size, a critical quality attribute (CQA). Hence, this calls for various mechanistic models to help predict product behaviour. This paper focuses on such models obtained from computational fluid dynamics (CFD) coupled with population balance modelling (PBM) and constitutive models (like shear and energy density). In a special case of the use of high-shear homogenisers (HSHs) for the manufacture of thick emulsions/gels, this work presents findings on (i) a scale-up algorithm for HSHs using shear strain, a novel scale-up parameter, for estimating mixing parameters, (ii) the non-linear relationship between viscosity and the shear imparted into the system, and (iii) the effect of hold time on the rheology of the product. Specific examples of how this approach enabled scale-up across the 1 L, 10 L, 200 L, 500 L, and 1000 L scales are discussed.
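
The abstract does not define its shear-strain parameter; a minimal sketch under common rules of thumb (nominal shear rate approximated as rotor tip speed over gap, total strain as shear rate times exposure time; all geometry values invented) might look like:

```python
import math

def tip_speed(d_rotor_m, rpm):
    return math.pi * d_rotor_m * rpm / 60.0            # m/s

def shear_rate(d_rotor_m, rpm, gap_m):
    return tip_speed(d_rotor_m, rpm) / gap_m           # 1/s, nominal gap shear

def total_shear_strain(d_rotor_m, rpm, gap_m, hold_s):
    # dimensionless "shear strain" candidate: rate x exposure time
    return shear_rate(d_rotor_m, rpm, gap_m) * hold_s

# Match the total strain of a lab run on a larger plant homogenizer
lab_strain = total_shear_strain(d_rotor_m=0.03, rpm=8000, gap_m=0.001, hold_s=300)
plant_rate = shear_rate(d_rotor_m=0.15, rpm=3000, gap_m=0.002)
print("required plant hold time (s):", lab_strain / plant_rate)
```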

Keywords: computational fluid dynamics, morphology, quality-by-design, rheology

Procedia PDF Downloads 269
6597 Forecasting Stock Indexes Using Bayesian Additive Regression Tree

Authors: Darren Zou

Abstract:

Forecasting the stock market is a very challenging task. Various economic indicators such as GDP, exchange rates, interest rates, and unemployment have a substantial impact on the stock market. Time series models are the traditional methods used to predict stock market changes. In this paper, a machine learning method, Bayesian Additive Regression Trees (BART), is used to predict stock market indexes based on multiple economic indicators. BART can be used to model heterogeneous treatment effects, and thereby works well when models are misspecified. It also has the capability to handle non-linear main effects and multi-way interactions without much input from financial analysts. In this research, BART is proposed to provide reliable predictions of day-to-day stock market activity. Comparing the results of BART with those of time series methods shows that BART performs well and has better predictive capability than the traditional methods.
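
A minimal sketch of the evaluation setup described: daily walk-forward prediction of an index from lagged economic indicators. BART itself is usually run via R packages, so to keep this sketch self-contained in Python a gradient-boosted tree ensemble stands in for the BART regressor; the data are simulated, and the walk-forward refitting loop is the point of the example:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 4))    # stand-ins for GDP, FX, rates, unemployment
y = X @ np.array([0.5, -0.3, 0.2, -0.1]) + rng.normal(scale=0.5, size=n)

preds = []
for t in range(400, n):        # refit each "day" on all history up to t
    model = GradientBoostingRegressor().fit(X[:t], y[:t])
    preds.append(model.predict(X[t:t + 1])[0])

rmse = np.sqrt(np.mean((np.array(preds) - y[400:]) ** 2))
print("walk-forward RMSE:", rmse)
```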

Keywords: BART, Bayesian, predict, stock

Procedia PDF Downloads 130
6596 Psychological Capital and Intention for Self-Employment among Students in HEIs: A Multi-group Analysis Approach

Authors: Ugur Choban, Aruzhan Zhaksylyk, Assylbek Nurgabdeshov

Abstract:

In recent years, there has been an increasing understanding of the value of encouraging entrepreneurial attitudes in university students, motivated by the belief that stimulating entrepreneurship not only promotes economic growth but also fosters innovation. This study examines the complex link between psychological capital and entrepreneurial intention among university students and addresses critical gaps, with a specific emphasis on how contextual factors like academic support and past business experience impact this dynamic. Using a quantitative research method, data were gathered from a broad sample of 300 university students drawn from several faculties. The study used a questionnaire that included the Psychological Capital Questionnaire (PCQ) to assess psychological capital and a validated scale for entrepreneurial intention, as well as binary measures of academic support and prior entrepreneurial experience. Statistical investigations, including multigroup analyses performed with SmartPLS software, provided interesting insights into the effect of contextual factors on the relationship between psychological capital and entrepreneurial intention. The findings highlight that psychological capital has a strong favorable influence on university students' entrepreneurial inclinations. The study also found that academic support enhances the influence of psychological capital on entrepreneurial intentions, emphasizing the significance of institutional backing in fostering entrepreneurial mindsets. Furthermore, students with prior entrepreneurial experience had a stronger propensity for entrepreneurship, showing a synergistic link between psychological capital and entrepreneurial background. These findings have both theoretical and practical implications. By explaining the mechanisms through which psychological capital promotes entrepreneurial intentions, the study contributes to the design of focused entrepreneurship education programs and support activities suited to student needs. Policymakers may use these findings to create policies that encourage student entrepreneurship, ultimately supporting economic development and innovation.

Keywords: academic support, entrepreneurial intentions, higher education institutions, psychological capital, prior entrepreneurial experience

Procedia PDF Downloads 56
6595 Effect of Realistic Lubricant Properties on Thermal Elastohydrodynamic Lubrication Behavior in Circular Contacts

Authors: Puneet Katyal, Punit Kumar

Abstract:

A great deal of effort has been devoted to thermal effects in elastohydrodynamic lubrication (TEHL) during the last five decades. The focus has primarily been on developing efficient numerical schemes to deal with the computational challenges involved in the solution of the TEHL model; however, some important aspects related to the accurate description of lubricant properties such as viscosity, rheology, and thermal conductivity in EHL point contact analysis remain largely neglected. The few studies available in this regard are based upon highly complex mathematical models that are difficult to formulate and execute. Using a simplified thermal EHL model for point contacts, this work sheds some light on the importance of accurate characterization of the lubricant properties and demonstrates that the computed TEHL characteristics are highly sensitive to these properties. It also emphasizes the use of appropriate mathematical models with experimentally determined parameters to account for correct lubricant behaviour.
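
The abstract does not name its rheological model; as an illustration of the shear-thinning laws at issue, a standard choice is the Carreau model relating effective viscosity to shear rate (an assumption for illustration, not the paper's stated constitutive equation):

```latex
% Carreau shear-thinning law: \eta_0 = zero-shear viscosity,
% \eta_\infty = infinite-shear viscosity, \lambda = relaxation time,
% n = power-law index (n < 1 gives shear thinning).
\eta(\dot{\gamma}) \;=\; \eta_{\infty}
  + \left( \eta_{0} - \eta_{\infty} \right)
    \left[ 1 + (\lambda \dot{\gamma})^{2} \right]^{\frac{n-1}{2}}
```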

Keywords: TEHL, shear thinning, rheology, conductivity

Procedia PDF Downloads 200
6594 Orthogonal Metal Cutting Simulation of Steel AISI 1045 via Smoothed Particle Hydrodynamic Method

Authors: Seyed Hamed Hashemi Sohi, Gerald Jo Denoga

Abstract:

Machining, or metal cutting, is one of the most widely used production processes in industry. The quality of the process and of the resulting machined product depends on parameters like tool geometry, material, and cutting conditions. However, the relationships of these parameters to the cutting process are often based mostly on empirical knowledge. In this study, computer modeling and simulation using LS-DYNA software and a Smoothed Particle Hydrodynamics (SPH) methodology were performed on the orthogonal metal cutting process to analyze the three-dimensional deformation of AISI 1045 medium carbon steel during machining. The simulation was performed using the following constitutive models: the Power Law model, the Johnson-Cook model, and the Zerilli-Armstrong (Z-A) model. The outcomes were compared against the simulated results obtained by Cenk Kiliçaslan using the Finite Element Method (FEM) and the empirical results of Jaspers and Filice. The analysis shows that the SPH method combined with the Zerilli-Armstrong constitutive model is a viable alternative for simulating the metal cutting process. The tangential force was overestimated by 7%, and the normal force was underestimated by 16%, when compared with empirical values. The simulated values of flow stress versus strain at various temperatures were also validated against empirical values. The SPH method using the Z-A model has also proven to be robust against issues of time-scaling. Experimental work was also done to investigate the effects of friction, rake angle, and tool tip radius on the simulation.

Keywords: metal cutting, smoothed particle hydrodynamics, constitutive models, experimental, cutting forces analyses

Procedia PDF Downloads 261
6593 Intellectual Property Rights and Health Rights: A Feasible Reform Proposal to Facilitate Access to Drugs in Developing Countries

Authors: M. G. Cattaneo

Abstract:

The non-effectiveness of certain codified human rights is particularly apparent with reference to the lack of access to essential drugs in developing countries, which represents a breach of the human right to receive adequate health assistance. This paper underlines the conflict and the legal contradictions between human rights, namely health rights, international intellectual property rights, in particular patent law, and international trade law. The paper discusses the crucial links between R&D costs for innovation, patents, and new medical drugs, with the goal of reformulating the hierarchy of priorities and of interests at stake in the international intellectual property (IP) law system. Differently from what happens today, international patent law should be a legal instrument apt to rebalance the axiological asymmetry between the conflicting needs at stake. The core argument in the paper is the proposal of an alternative pathway, namely a feasible proposal for patent law reform. IP laws tend to balance the benefits deriving from innovation with the costs of the monopoly provided, but since developing countries and industrialized countries are in completely different political and economic situations, it is necessary to (re)modulate this exchange according to the different needs. Based on this critical analysis, the paper puts forward a proposal, called Trading Time for Space (TTS), whereby a longer exclusive patent life in western countries (Time) is offered to the patent-holding company in exchange for the latter selling the medical drug at cost price in developing countries (Space). Accordingly, pharmaceutical companies should sell drugs in developing countries at cost price, or alternatively grant a free license for sale in such countries, without any royalties or fees. However, such a social service should be duly compensated: the consideration for it would be an extension of the temporal duration of the patent's exclusivity in the country of origin, compensating the profits reduced by supplying at cost price in developing countries.

Keywords: global health, global justice, patent law reform, access to drugs

Procedia PDF Downloads 246
6592 Short Life Cycle Time Series Forecasting

Authors: Shalaka Kadam, Dinesh Apte, Sagar Mainkar

Abstract:

The life cycles of products are becoming shorter and shorter due to increased competition in the market, shorter product development times, and increased product diversity. Short life cycles are normal in the retail industry, the style business, entertainment media, and the telecom and semiconductor industries. The subject of accurate demand forecasting for short-lifecycle products is of special interest to many researchers and organizations. Due to the short life cycle of such products, the amount of historical data available for forecasting is minimal, or even absent when new or modified products are launched in the market. The companies dealing with such products want to increase the accuracy of demand forecasting so that they can utilize the full potential of the market while not oversupplying. This poses the challenge of developing a forecasting model that can forecast accurately while handling large variations in the data and considering the complex relationships between its various parameters. Many statistical models have been proposed in the literature for forecasting time series data. Traditional time series forecasting models do not work well for short life cycles due to the lack of historical data, and artificial neural network (ANN) models are very time-consuming for forecasting. We have studied the existing models used for forecasting and their limitations. This work proposes an effective and powerful approach for short-life-cycle time series forecasting. The proposed approach takes into consideration different scenarios related to data availability for short-lifecycle products, and suggests a methodology that combines statistical analysis with structured judgement; the defined approach can also be applied across domains. We then describe the method of creating a profile from analogous products; this profile can be used for forecasting products with the historical data of analogous products. We have designed an application that combines data, analytics, and domain knowledge using point-and-click technology. The generated forecasting results are compared using the MAPE, MSE, and RMSE error scores. Conclusion: based on the results, it is observed that no single approach is sufficient for short-life-cycle forecasting, and two or more approaches need to be combined to achieve the desired accuracy.
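
A minimal sketch of the three error scores used to compare the forecasts (standard definitions; the numbers below are made up):

```python
import numpy as np

def forecast_errors(actual, forecast):
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    e = actual - forecast
    return {
        "MAPE": np.mean(np.abs(e / actual)) * 100,  # percent; needs actual != 0
        "MSE": np.mean(e ** 2),
        "RMSE": np.sqrt(np.mean(e ** 2)),
    }

print(forecast_errors(actual=[100, 120, 90], forecast=[110, 115, 95]))
```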

Keywords: forecast, short life cycle product, structured judgement, time series

Procedia PDF Downloads 358
6591 Incorporating Lexical-Semantic Knowledge into Convolutional Neural Network Framework for Pediatric Disease Diagnosis

Authors: Xiaocong Liu, Huazhen Wang, Ting He, Xiaozheng Li, Weihan Zhang, Jian Chen

Abstract:

The utilization of electronic medical record (EMR) data to establish disease diagnosis models has become an important research topic in biomedical informatics. Deep learning can automatically extract features from massive data, which has brought about breakthroughs in the study of EMR data. The challenge is that deep learning lacks semantic knowledge, which limits its practicability in medical science. This research proposes a method of incorporating lexical-semantic knowledge from abundant entities into a convolutional neural network (CNN) framework for pediatric disease diagnosis. Firstly, medical terms are vectorized into Lexical Semantic Vectors (LSV), which are concatenated with the embedded word vectors of word2vec to enrich the feature representation. Secondly, the semantic distribution of medical terms serves as a Semantic Decision Guide (SDG) for the optimization of the deep learning models. The study evaluates the performance of the LSV-SDG-CNN model on four Chinese EMR datasets, with CNN, LSV-CNN, and SDG-CNN designed as baseline models for comparison. The experimental results show that the LSV-SDG-CNN model outperforms the baseline models on all four datasets; the best configuration of the model yielded an F1 score of 86.20%. The results clearly demonstrate that the CNN has been effectively guided and optimized by lexical-semantic knowledge, and that the LSV-SDG-CNN model improves the disease classification accuracy by a clear margin.
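
A minimal sketch of the feature-enrichment step, concatenating each term's word2vec embedding with its Lexical Semantic Vector before the CNN; the vocabulary, dimensions, and random stub matrices are illustrative assumptions:

```python
import numpy as np

w2v_dim, lsv_dim = 100, 20
vocab = {"fever": 0, "cough": 1, "rash": 2}
word2vec = np.random.rand(len(vocab), w2v_dim)  # pretrained embeddings (stub)
lsv = np.random.rand(len(vocab), lsv_dim)       # lexical-semantic vectors (stub)

def term_features(term):
    """Enriched representation: word2vec vector + LSV, concatenated."""
    i = vocab[term]
    return np.concatenate([word2vec[i], lsv[i]])

print(term_features("fever").shape)  # (120,) -> input channel for the CNN
```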

Keywords: convolutional neural network, electronic medical record, feature representation, lexical semantics, semantic decision

Procedia PDF Downloads 126
6590 Introduction of a Multimodal Intervention for People with Autism: 'ReAttach'

Authors: P. Weerkamp Bartholomeus

Abstract:

Autism treatment evaluation is crucial for monitoring the development of an intervention at an early stage. 'ReAttach' is a new intervention based on the principles of attachment and social cognitive training. Practical research suggests promising results in a variety of developmental areas. Five years after the first ReAttach sessions, these findings can be extended with qualitative research by means of follow-up interviews. The potential impact of this treatment on the daily life functioning and well-being of autistic persons becomes clear.

Keywords: autism, innovation, treatment, social cognitive training

Procedia PDF Downloads 291
6589 Life Prediction Method of Lithium-Ion Battery Based on Grey Support Vector Machines

Authors: Xiaogang Li, Jieqiong Miao

Abstract:

To address the low prediction accuracy of the grey forecasting model, an improved grey prediction model is put forward. Firstly, a trigonometric function is used to transform the original data sequence in order to improve the smoothness of the data; this model is called SGM (smoothed grey prediction model). The improved grey model is then combined with a support vector machine, giving the grey support vector machine model (SGM-SVM). Before the establishment of the model, trigonometric functions and the accumulated generating operation are used to preprocess the data, in order to enhance its smoothness and weaken its randomness; a support vector machine (SVM) is then used to establish a prediction model on the preprocessed data, with the model parameters selected by genetic algorithms so as to obtain the global-search optimum. Finally, the data are restored through the inverse ("regressive") generating operation to obtain the forecast. To show that the SGM-SVM model is superior to other models, battery life data from CALCE are selected; the presented model is used to predict battery life, and the prediction is compared with those of the grey model and of support vector machines. For a more intuitive comparison of the three models, this paper presents their root mean square errors. The results show that the grey support vector machine (SGM-SVM) predicts life best, with a root mean square error of only 3.18%.
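
A minimal sketch of the classical GM(1,1) grey model that the SGM-SVM approach builds on; the trigonometric smoothing and SVM stages are omitted, and the capacity numbers are illustrative, not the CALCE data:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Classical GM(1,1): accumulate, fit a, b by least squares, forecast."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                               # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                    # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0] # develop. coeff., grey input
    k = np.arange(1, len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(np.concatenate([[x0[0]], x1_hat]))  # inverse AGO
    return x0_hat[-steps:]

capacity = [1.00, 0.98, 0.97, 0.95, 0.94, 0.92]      # illustrative battery data
print(gm11_forecast(capacity, steps=2))
```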

Keywords: grey prediction model, trigonometric functions, support vector machines, genetic algorithms, root mean square error

Procedia PDF Downloads 461
6588 A Study on the New Weapon Requirements Analytics Using Simulations and Big Data

Authors: Won Il Jung, Gene Lee, Luis Rabelo

Abstract:

Since many weapon systems are becoming more complex and diverse, various problems occur in terms of acquisition cost, time, and performance limitations. Executing experiments in the real world to obtain the Required Operational Characteristics (ROC) for a new weapon acquisition is costly, dangerous, and time-consuming, even though it enhances the fidelity of the experimental results. Moreover, until now most of the research has relied on a large number of assumptions, so a bias is present in the experimental results. A new methodology is therefore proposed to solve these problems without such a variety of assumptions. The ROC of the new weapon system is developed through this methodology, which analyzes big data generated by simulating various scenarios based on virtual and constructive models involving six degrees of freedom (6DoF). The new methodology enables us to identify unbiased ROC for new weapons by reducing assumptions, and provides support for optimal weapon system acquisition.

Keywords: big data, required operational characteristics (ROC), virtual and constructive models, weapon acquisition

Procedia PDF Downloads 289
6587 Gravitational Frequency Shifts for Photons and Particles

Authors: Jing-Gang Xie

Abstract:

This research considers the integration of quantum field theory and general relativity. Although both are successful models in explaining the behavior of particles, they are incompatible, since they work at different scales of mass and energy, as evidenced in the description of black holes and the formation of the universe. This remains so despite previous efforts to merge the two theories, including the likes of string theory and quantum gravity models. In a bid to arrive at an actionable experiment, the paper's approach starts from the derivations of the existing theories. It goes on to test these derivations by applying the same initial assumptions, coupled with several deviations. The resulting equations reproduce the results of the classical Newtonian model, quantum mechanics, and general relativity as long as conditions are normal. However, outcomes differ when conditions are extreme; specifically, there are no breakdowns even below the Schwarzschild radius or at the Planck length. This demonstrates the possibility of integrating the two theories.
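
For reference, the textbook general-relativistic frequency shift that any such model must recover under normal conditions is the Schwarzschild result for a photon emitted at radius r from mass M and received far away (a standard formula quoted for context, not the paper's own derivation):

```latex
% Gravitational redshift, Schwarzschild metric, receiver at infinity;
% the right-hand side is the weak-field expansion.
\frac{f_{\mathrm{received}}}{f_{\mathrm{emitted}}}
  = \sqrt{1 - \frac{2GM}{rc^{2}}}
  \;\approx\; 1 - \frac{GM}{rc^{2}}
  \qquad \text{for } \frac{2GM}{rc^{2}} \ll 1 .
```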

Keywords: general relativity theory, particles, photons, Quantum Gravity Model, gravitational frequency shift

Procedia PDF Downloads 359
6586 Cost and Benefits of Collocation in the Use of Biogas to Reduce Vulnerabilities and Risks

Authors: Janaina Camile Pasqual Lofhagen, David Savarese, Veronika Vazhnik

Abstract:

The urgency of the climate crisis requires both innovation and practicality. The energy transition framework allows industry to deliver resilient cities, enhance adaptability to change, pursue energy objectives such as growth or efficiency, and increase renewable energy. This paper investigates a real-world application perspective on the use of biogas in Brazil and the U.S. It examines interventions that provide a foundation of infrastructure, as well as the tangible benefits for policy-makers crafting laws and providing incentives.

Keywords: resilience, vulnerability, risks, biogas, sustainability

Procedia PDF Downloads 105
6585 Promoting Biofuels in India: Assessing Land Use Shifts Using Econometric Acreage Response Models

Authors: Y. Bhatt, N. Ghosh, N. Tiwari

Abstract:

Acreage response functions are modeled taking account of expected harvest prices, weather-related variables, and other non-price variables, allowing for the possibility of partial adjustment. At the outset, based on the literature on price expectation formation, we explored suitable formulations for estimating farmers' expected prices. Assuming that farmers form expectations rationally, the prices of food and biofuel crops are modeled using time-series methods, testing for possible ARCH/GARCH effects to account for volatility. The prices projected on the basis of these models are then inserted as proxies for the expected prices in the acreage response functions. Food crop acreages in different growing states are found to be sensitive to their prices relative to those of one or more of the biofuel crops considered. The percentage improvement in food crop yields required to offset the acreage loss is worked out.
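
A minimal sketch of a partial-adjustment acreage response regression: current acreage on the expected price and lagged acreage. The Nerlove-type specification and the simulated data are assumptions for illustration; the abstract does not give the exact equation:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T = 40
p_exp = rng.normal(10, 2, T)              # proxy for expected harvest price
acreage = np.empty(T)
acreage[0] = 100
for t in range(1, T):                     # true partial-adjustment process
    acreage[t] = 20 + 3 * p_exp[t] + 0.5 * acreage[t - 1] + rng.normal(0, 2)

# Regress A_t on a constant, expected price, and A_{t-1}
X = sm.add_constant(np.column_stack([p_exp[1:], acreage[:-1]]))
fit = sm.OLS(acreage[1:], X).fit()
print(fit.params)   # intercept, price response, adjustment coefficient
```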

Keywords: acreage response function, biofuel, food security, sustainable development

Procedia PDF Downloads 301
6584 The Use of Empirical Models to Estimate Soil Erosion in Arid Ecosystems and the Importance of Native Vegetation

Authors: Meshal M. Abdullah, Rusty A. Feagin, Layla Musawi

Abstract:

When humans mismanage arid landscapes, soil erosion can become a primary mechanism that leads to desertification. This study focuses on applying soil erosion models to a disturbed landscape in Umm Nigga, Kuwait, and on identifying its predicted change under restoration plans. The northern portion of Umm Nigga, containing both coastal and desert ecosystems, falls within the boundaries of the Demilitarized Zone (DMZ) adjacent to Iraq and has been fenced off to restrict public access since 1994. The central objective of this project was to utilize GIS and remote sensing to compare the MPSIAC (Modified Pacific Southwest Inter-Agency Committee), EMP (Erosion Potential Method), and USLE (Universal Soil Loss Equation) soil erosion models and determine their applicability for arid regions such as Kuwait. Spatial analysis was used to develop the necessary datasets for factors such as soil characteristics, vegetation cover, runoff, climate, and topography. Results showed that the MPSIAC and EMP models produced a similar spatial distribution of erosion, though the MPSIAC showed more variability. For the MPSIAC model, approximately 45% of the land surface ranged from moderate to high soil loss, while 35% did so for the EMP model. The USLE model had contrasting results and a different spatial distribution of soil loss, with 25% of the area ranging from moderate to high erosion and 75% from low to very low. We concluded that MPSIAC and EMP were the most suitable models for arid regions in general, with the MPSIAC model being the best. We then applied the MPSIAC model to quantify soil loss between coastal and desert areas, and between fenced and unfenced sites. In the desert area, soil loss differed between fenced and unfenced sites: at the fenced desert sites, 88% of the surface was covered with vegetation and soil loss was very low, while at the unfenced desert sites vegetation cover was only 3% and soil loss correspondingly higher. In the coastal areas, the amount of soil loss was nearly the same between fenced and unfenced sites. These results imply that vegetation cover plays an important role in reducing soil erosion, and that fencing is much more important in the desert ecosystems as protection against overgrazing. When applying the MPSIAC model predictively, we found that vegetation cover could be increased from 3% to 37% in unfenced areas, and soil erosion would then decrease by 39%. We conclude that the MPSIAC model is the best for predicting soil erosion in arid regions such as Kuwait.
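
Of the three models compared, the USLE has the simplest closed form, A = R x K x LS x C x P. A minimal sketch with illustrative factor values (not the Umm Nigga inputs) shows how reducing the cover factor C, e.g., by restoring vegetation, lowers the predicted loss:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Annual soil loss A (t/ha/yr): R = rainfall erosivity, K = soil
    erodibility, LS = slope length-steepness, C = cover-management,
    P = support practice."""
    return R * K * LS * C * P

# Same terrain, sparse vegetation (high C) vs. restored cover (low C)
print(usle_soil_loss(R=80, K=0.4, LS=1.2, C=0.45, P=1.0))  # degraded site
print(usle_soil_loss(R=80, K=0.4, LS=1.2, C=0.05, P=1.0))  # vegetated site
```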

Keywords: soil erosion, GIS, Modified Pacific Southwest Inter-Agency Committee model (MPSIAC), Erosion Potential Method (EMP), Universal Soil Loss Equation (USLE)

Procedia PDF Downloads 297
6583 Removal of Heavy Metal from Wastewater using Bio-Adsorbent

Authors: Rakesh Namdeti

Abstract:

The liquid waste (wastewater) is essentially the water supply of the community after it has been used in a variety of applications. In recent years, concentrations of heavy metals, besides other pollutants, have increased to levels dangerous for the living environment in many regions. Among the heavy metals, lead has the most damaging effects on human health. It can enter the human body through the uptake of food (65%), water (20%), and air (15%). Against this background, a low-cost and easily available biosorbent was used and is reported in this study. The scope of the present study is to remove lead from aqueous solution using Olea europaea resin as a biosorbent. The results showed that the biosorption capacity of the Olea europaea resin biosorbent was substantial for lead removal. The Langmuir, Freundlich, Temkin, and Dubinin-Radushkevich (D-R) models were used to describe the biosorption equilibrium of lead on the Olea europaea resin biosorbent, and the biosorption followed the Langmuir isotherm. The kinetic models showed that the pseudo-second-order rate expression represented the biosorption data well for this biosorbent.
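
A minimal sketch of fitting the Langmuir isotherm, q = qm·b·Ce / (1 + b·Ce), to equilibrium data with SciPy; the concentrations and "true" parameters are synthetic, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qm, b):
    # qm = maximum capacity (mg/g), b = Langmuir constant (L/mg)
    return qm * b * Ce / (1 + b * Ce)

Ce = np.array([5, 10, 25, 50, 100, 200], dtype=float)   # equilibrium conc. (mg/L)
rng = np.random.default_rng(2)
q = langmuir(Ce, qm=45.0, b=0.03) + rng.normal(0, 0.5, Ce.size)  # synthetic data

(qm_fit, b_fit), _ = curve_fit(langmuir, Ce, q, p0=[30, 0.01])
print(f"qm = {qm_fit:.1f} mg/g, b = {b_fit:.3f} L/mg")
```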

Keywords: novel biosorbent, central composite design, Lead, isotherms, kinetics

Procedia PDF Downloads 78
6582 Refitting Equations for Peak Ground Acceleration in Light of the PF-L Database

Authors: Matevž Breška, Iztok Peruš, Vlado Stankovski

Abstract:

A systematic overview of existing Ground Motion Prediction Equations (GMPEs) has been published by Douglas. The number of earthquake recordings used for fitting these equations has increased over the past decades; the current PF-L database contains 3550 recordings. Since GMPEs frequently model the peak ground acceleration (PGA), the goal of the present study was to refit a selection of 44 of the existing equation models for PGA in light of the latest data. The Levenberg-Marquardt algorithm was used for fitting the coefficients of the equations, and the results are evaluated both quantitatively, by presenting the root mean squared error (RMSE), and qualitatively, by drawing graphs of the five best-fitted equations. The RMSE was found to be as low as 0.08 for the best equation models. The newly estimated coefficients vary from the values published in the original works.
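
A minimal sketch of refitting one GMPE with Levenberg-Marquardt, as SciPy exposes it via least_squares(method="lm"). The functional form below is one common simple GMPE shape and the records are synthetic; neither is taken from the 44 models or the PF-L database:

```python
import numpy as np
from scipy.optimize import least_squares

def ln_pga(theta, M, R):
    a, b, c = theta
    return a + b * M - c * np.log(R)   # assumed simple magnitude-distance form

rng = np.random.default_rng(3)
M = rng.uniform(4, 7, 300)             # magnitudes (synthetic catalog)
R = rng.uniform(5, 200, 300)           # distances in km
obs = ln_pga([-3.5, 0.9, 1.1], M, R) + rng.normal(0, 0.3, 300)

fit = least_squares(lambda t: ln_pga(t, M, R) - obs, x0=[0.0, 1.0, 1.0],
                    method="lm")       # Levenberg-Marquardt
rmse = np.sqrt(np.mean(fit.fun ** 2))
print("coefficients:", fit.x, " RMSE:", rmse)
```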

Keywords: Ground Motion Prediction Equations, Levenberg-Marquardt algorithm, refitting PF-L database, peak ground acceleration

Procedia PDF Downloads 462
6581 Finite Element Modeling Techniques of Concrete in Steel and Concrete Composite Members

Authors: J. Bartus, J. Odrobinak

Abstract:

The paper presents a nonlinear 3D finite element model of composite steel and concrete beams with web openings, analyzed using the Finite Element Method (FEM). The core of the study is the introduction of basic modeling techniques, comprising the description of material behavior, the selection of appropriate elements, and recommendations for overcoming problems with convergence. Results from various finite element models are compared in the study. The main objective is to observe the concrete failure mechanism and its influence on the structural performance of the numerical models of the beams at particular load stages. The bearing capacity of the beams and the corresponding deformations, stresses, strains, and fracture patterns were determined. The results show how load-bearing elements consisting of concrete parts can be analyzed using FEM software, with various options for creating the most suitable numerical model. The paper demonstrates the versatility of the Ansys software for structural simulations.

Keywords: Ansys, concrete, modeling, steel

Procedia PDF Downloads 121
6580 Generalization of Zhou Fixed Point Theorem

Authors: Yu Lu

Abstract:

Fixed point theory is a basic tool for the study of the existence of Nash equilibria in game theory. This paper presents a significant generalization of the Veinott-Zhou fixed point theorem for increasing correspondences, which serves as an essential framework for investigating the existence of Nash equilibria in supermodular and quasisupermodular games. To establish our proofs, we explore different conceptions of multivalued increasingness and provide comprehensive results concerning the existence of the largest/least fixed point. We provide two distinct approaches to the proof, each offering unique insights and advantages. These advancements not only extend the applicability of the Veinott-Zhou theorem to a broader range of economic scenarios but also enhance the theoretical framework for analyzing equilibrium behavior in complex game-theoretic models. Our findings pave the way for future research in the development of more sophisticated models of economic behavior and strategic interaction.
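
For context, the single-valued theorem and the set-valued generalization this paper extends are usually stated as follows (textbook versions with our notation; the amsthm theorem environment is assumed):

```latex
\begin{theorem}[Tarski]
If $(X,\leq)$ is a complete lattice and $f\colon X \to X$ is order-preserving,
then the set of fixed points of $f$ is a nonempty complete lattice; in
particular, largest and least fixed points exist.
\end{theorem}

\begin{theorem}[Zhou]
If $(X,\leq)$ is a complete lattice and $F\colon X \rightrightarrows X$ is an
increasing correspondence (in Veinott's strong set order) whose values are
nonempty subcomplete sublattices of $X$, then the set of fixed points
$\{\, x \in X : x \in F(x) \,\}$ is a nonempty complete lattice.
\end{theorem}
```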

Keywords: fixed-point, Tarski’s fixed-point theorem, Nash equilibrium, supermodular game

Procedia PDF Downloads 55
6579 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
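
A minimal sketch of the classical isotropic Clarke baseline that the paper generalizes: with arrival angles uniform on [0, 2π), the Doppler shift f = f_max·cos(θ) has the well-known "bathtub" density 1 / (π·sqrt(f_max² − f²)). The simulation below just checks a histogram against that analytic density (illustration only; the paper's non-isotropic, line-of-sight case differs):

```python
import numpy as np

f_max = 100.0                                    # maximum Doppler shift (Hz)
theta = np.random.default_rng(4).uniform(0, 2 * np.pi, 200_000)
shifts = f_max * np.cos(theta)                   # simulated Doppler shifts

hist, edges = np.histogram(shifts, bins=50, range=(-99, 99), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
analytic = 1.0 / (np.pi * np.sqrt(f_max**2 - centers**2))
# Largest discrepancy (biggest near the band edges, where the density diverges)
print("max |simulated - analytic| density gap:", np.abs(hist - analytic).max())
```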

Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution

Procedia PDF Downloads 372
6578 Cirrhosis Mortality Prediction as Classification using Frequent Subgraph Mining

Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride

Abstract:

In this work, we use machine learning and novel data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidities, clinical procedures, and laboratory tests is analyzed, and a temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The mortality prediction of the Model for End-stage Liver Disease (MELD) score is used as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM together with a machine learning technique called ensembling further improves model performance. With the abundance of data available in healthcare through electronic health records (EHRs), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements on its own. To the best of our knowledge, this is the first work to apply modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients, and it builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
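
A minimal sketch of the AUC comparison between a MELD-style baseline score and a machine-learning model's predicted risk; the labels and scores are synthetic stand-ins, not the study's patient data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
died = rng.integers(0, 2, 1000)      # one-year mortality labels (synthetic)
# Two risk scores: the second separates the classes more strongly by design
meld_like = died * rng.normal(1.0, 1.0, 1000) + rng.normal(0, 1.0, 1000)
ml_model = died * rng.normal(1.6, 1.0, 1000) + rng.normal(0, 1.0, 1000)

print("MELD-like AUC:", roc_auc_score(died, meld_like))
print("ML model AUC :", roc_auc_score(died, ml_model))
```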

Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning

Procedia PDF Downloads 134