Search results for: target company
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4260

1980 Smart Product-Service System Innovation with User Experience: A Case Study of Chunmi

Authors: Ying Yu, Wen-Chi Kuo, Tung-Jung Sung

Abstract:

The Product-Service System (PSS) has received widespread attention due to increasing global competition in manufacturing and service markets. Today’s smart products and services are driven by Internet of Things (IoT) technologies, which promote the transformation from traditional PSS to smart PSS. Although smart PSS has produced technological achievements in business, it often ignores the real demands of target users. Designers should therefore understand the User Experience (UX) of smart products, services, and systems. However, both academia and industry still lack development experience with smart PSS, since it is an emerging field. This paper therefore presents a case study of Xiaomi’s Chunmi, the largest IoT platform in the world, and addresses two major issues: (1) why Chunmi should develop smart PSS strategies around UX; and (2) how Chunmi could successfully implement the strategic objectives of smart PSS through design. The case study results indicate that: (1) a smart PSS can be distinguished from competitors by its unique UX, which is difficult to duplicate; (2) early user engagement is crucial for the success of a smart PSS; and (3) interaction, expectation, and enjoyment can be treated as a three-dimensional evaluation of UX design for smart PSS innovation. In conclusion, a smart PSS can gain competitive advantage in the market through good UX design.

Keywords: design, smart PSS, user experience, user engagement

Procedia PDF Downloads 136
1979 The Ductile Fracture of Armor Steel Targets Subjected to Ballistic Impact and Perforation: Calibration of Four Damage Criteria

Authors: Imen Asma Mbarek, Alexis Rusinek, Etienne Petit, Guy Sutter, Gautier List

Abstract:

Over the past two decades, the automotive, aerospace and army industries have been paying increasing attention to Finite Element (FE) numerical simulations of the fracture process of their structures. Thanks to numerical simulations, it is nowadays possible to analyze safely and at reduced cost several problems involving costly and dangerous extreme loadings, such as blast or ballistic impact problems. The present paper is concerned with ballistic impact and perforation problems involving ductile fracture of thin armor steel targets. The target fracture process usually depends on various parameters: the projectile nose shape, the target thickness and its mechanical properties, as well as the impact conditions (friction, oblique/normal impact, etc.). In this work, the investigations concern the normal impact of a conical-nosed projectile on thin armor steel targets. The main aim is to establish a comparative study of four fracture criteria that are commonly used in simulating the fracture of structures subjected to extreme loadings such as ballistic impact and perforation. Damage initiation usually results from a complex physical process that occurs at the micromechanical scale. On a macro scale, and according to the following fracture models, the variables on which fracture depends are mainly the stress triaxiality η, the strain rate, the temperature T, and possibly the Lode angle parameter θ. The four failure criteria are: the critical strain to failure model, the Johnson-Cook model, the Wierzbicki model and the Modified Hosford-Coulomb (MHC) model. SEM observations of the fracture surfaces of tension specimens and of armor steel targets impacted at low and high incident velocities show that the fracture of the specimens is ductile. The failure mode of the targets is petalling with crack propagation, and the fracture surfaces are covered with micro-cavities.
The parameters of each ductile fracture model were identified for three armor steels, and the applicability of each criterion was evaluated using experimental investigations coupled with numerical simulations. Two loading paths were investigated in this study under a wide range of strain rates: quasi-static and intermediate uniaxial tension, and quasi-static and dynamic double shear testing, covering various values of the stress triaxiality η and of the Lode angle parameter θ. All experiments were conducted on three different armor steel specimens at quasi-static strain rates ranging from 10⁻⁴ to 10⁻¹ s⁻¹ and at three temperatures ranging from 297 K to 500 K, allowing the influence of temperature on the fracture process to be assessed. Intermediate tension testing was coupled with dynamic double shear experiments conducted on a Hopkinson tube device, making it possible to observe the effect of high strain rate on damage evolution and crack propagation. The aforementioned fracture criteria were implemented into the FE code ABAQUS via a VUMAT subroutine and coupled with suitable constitutive relations to obtain reliable simulations of ballistic impact problems. The calibration of the four damage criteria as well as a concise evaluation of the applicability of each criterion are detailed in this work.
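As a sketch of one of the four criteria named above, the widely used Johnson-Cook fracture model expresses the equivalent failure strain as a product of triaxiality, strain-rate, and temperature terms. The material constants below are hypothetical placeholders, not the calibrated values from this work:

```python
import math

def jc_failure_strain(eta, strain_rate, T, D,
                      rate_ref=1.0, T_ref=297.0, T_melt=1800.0):
    """Johnson-Cook equivalent failure strain.

    eta: stress triaxiality; D: material constants (D1..D5), here
    hypothetical; rate_ref, T_ref, T_melt: reference strain rate and
    temperatures defining the homologous temperature T*.
    """
    D1, D2, D3, D4, D5 = D
    T_star = (T - T_ref) / (T_melt - T_ref)
    return (D1 + D2 * math.exp(D3 * eta)) \
        * (1.0 + D4 * math.log(strain_rate / rate_ref)) \
        * (1.0 + D5 * T_star)

def damage_increment(delta_eps_p, eps_f):
    """Incremental damage for one plastic strain increment; the element
    is considered failed once the accumulated damage D reaches 1."""
    return delta_eps_p / eps_f
```

In a VUMAT-style implementation, `damage_increment` would be summed over the loading history at each integration point and the element deleted when the accumulated damage reaches unity.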

Keywords: armor steels, ballistic impact, damage criteria, ductile fracture, SEM

Procedia PDF Downloads 313
1978 Does Indian Intellectual Property Policy Affect the U.S. Pharmaceutical Industry? A Comparative Study of Pfizer and Ranbaxy Laboratories in Regards to Trade Related Aspects of Intellectual Property Rights

Authors: Alina Hamid Bari

Abstract:

Intellectual Property (IP) policies of a country have a huge impact on the pharmaceutical industry, as this industry is all about patents. Developed countries have used IP protection to boost their economies; developing countries are concerned about access to medicine for poor people. The U.S. company Pfizer had a monopoly on Lipitor for 14 years, which came to an end when Pfizer decided to operate in India. This research focuses on the effects of Indian IP policies on the USA by comparing Pfizer and Ranbaxy with regard to Trade Related Aspects of Intellectual Property Rights. An inductive approach has been used. The main sources of material are annual reports and theory based on academic books and articles, along with court rulings, policy statements and decisions, websites and newspaper articles. A SWOT analysis was done for both Pfizer and Ranbaxy. The main comparison consisted of ratio analysis and analysis of the 2011-2012 annual reports of Pfizer and Ranbaxy to see the impact on their profitability. This research concludes that Indian intellectual property laws do affect the profitability of the U.S. pharmaceutical industry, which can in turn have an impact on the U.S. economy. These days, India grants patents only on products it deems deserving, so U.S. companies operating in India have to defend their inventions to obtain a patent. Thus, to operate in India and maintain a monopoly in the market, U.S. firms have to come up with different strategies.

Keywords: atorvastatin, India, intellectual property, lipitor, Pfizer, pharmaceutical industry, Ranbaxy, TRIPs, U.S.

Procedia PDF Downloads 474
1977 Evaluation of Tumor Microenvironment Using Molecular Imaging

Authors: Fakhrosadat Sajjadian, Ramin Ghasemi Shayan

Abstract:

The tumor microenvironment plays a fundamental role in tumor initiation, progression, metastasis, and treatment resistance. It differs from normal tissue in terms of its extracellular matrix, vascular and lymphatic networks, and physiological conditions. The clinical application of molecular cancer imaging is often hindered by the high commercialization costs of targeted imaging agents as well as the limited clinical applications and small market size of some agents. Since many cancer types share similar characteristics of the tumor microenvironment, the ability to target these biomarkers has the potential to provide clinically translatable molecular imaging technologies for multiple cancer types and broad clinical applications. Significant progress has been made in targeting the tumor microenvironment for molecular cancer imaging. In this review, we summarize the principles and strategies of recent advances in molecular imaging of the tumor microenvironment, using different imaging modalities for early detection and diagnosis of cancer. To conclude, the tumor microenvironment (TME) surrounding tumor cells is a highly dynamic and heterogeneous composition of immune cells, fibroblasts, precursor cells, endothelial cells, signaling molecules and extracellular matrix (ECM) components.

Keywords: molecular imaging, TME, medicine

Procedia PDF Downloads 45
1976 Biological Activity of Mesenchymal Stem Cells in the Surface of Implants

Authors: Saimir Heta, Ilma Robo, Dhimiter Papakozma, Eduart Kapaj, Vera Ostreni

Abstract:

Introduction: The biocompatible materials applied to implant surfaces are the target of recent literature studies. Methodology: Modifications of implant surfaces in different ways, such as the application of additional ions, changes to the surface microstructure, surface alteration by ultrasound or laser, or the application of various substances such as recombinant proteins, are among the approaches most frequently addressed in published articles. This is a review study whose main aim is to identify the different surface treatments for which the mesenchymal cell reaction is, according to the literature, equally positive toward the osseointegration process. Results: The literature emphasizes that implant success depends less on the implant brand than on the full implant treatment protocol, ranging from the patient's dental health to the choice of implant type according to the shape and level of the alveolar ridge. Conclusions: Osseointegration is a process that should initially be physiologically independent of the implant material. Given this physiological process, no brand of implant can claim credit for implant success, since the range of synthetic or natural materials that promote osseointegration is relatively large.

Keywords: mesenchymal cells, implants, review, biocompatible materials

Procedia PDF Downloads 86
1975 Epigenetic Mechanisms Involved in the Occurrence and Development of Infectious Diseases

Authors: Frank Boris Feutmba Keutchou, Saurelle Fabienne Bieghan Same, Verelle Elsa Fogang Pokam, Charles Ursula Metapi Meikeu, Angel Marilyne Messop Nzomo, Ousman Tamgue

Abstract:

Infectious diseases are one of the most important causes of morbidity and mortality worldwide. These diseases are caused by pathogenic microorganisms such as bacteria, viruses, parasites, and fungi. Heritable changes in gene expression that do not involve changes to the underlying DNA sequence are referred to as epigenetics. Emerging evidence suggests that epigenetic mechanisms are important in the emergence and progression of infectious diseases. Pathogens can manipulate the host epigenetic machinery to promote their own replication and evade immune responses. The Human Genome Project has provided new opportunities for developing better tools for diagnosis and for the identification of target genes. Several epigenetic modifications, such as DNA methylation, histone modifications, and non-coding RNA expression, have been shown to influence infectious disease outcomes. Understanding the epigenetic mechanisms underlying infectious diseases may lead to the development of new therapeutic approaches focusing on host-pathogen interactions. The goal of this study is to show how different infectious agents interact with host cells after infection.

Keywords: epigenetic, infectious disease, micro-pathogenic organism, phenotype

Procedia PDF Downloads 80
1974 Application of Reception Theory to Analyze the Translation as a Continuous Reception

Authors: Mina Darabi Amin

Abstract:

In 1972, Hans Robert Jauss introduced Reception Theory, a version of reader-response criticism that invites literary critics to re-examine the relationship between the author, the work and the reader. Examining these relationships has shown that, besides its creation, the reception and reading of a text occur at different levels, which frees the text from continuous reference to the meaning intended by the author and can lead to a multiplicity of possible interpretations according to the ‘Horizon of Expectations’. This theory can be associated with another intellectual process called ‘translation’, a process that is always confronted with different levels of readers in the target language and different levels of reception by these readers. By adopting the perspective of Reception Theory in translation, we can set aside the idea of a single definitive translation and consider the initiation to a literary text, its translation and its reception as a continuous process. Just like the creation of the text, its translation and reception are not accomplished once and for all; they are confronted with different levels of reception and interpretation which are made and remade endlessly. After getting to know and crossing the first levels, the Horizon of Expectations can be extended and the reader can be initiated to the higher levels. On the other hand, faithful and free translation are not opposed to each other; depending on the type of reception by the readers at a particular moment, the existence of both is necessary. In fact, it is the readers' level of reception and their Horizon of Expectations that determine the degree of fidelity and freedom of a translation.

Keywords: reception theory, reading, literary translation, horizons of expectation, reader

Procedia PDF Downloads 182
1973 Design and Fabrication of Optical Nanobiosensors for Detection of MicroRNAs Involved in Neurodegenerative Diseases

Authors: Mahdi Rahaie

Abstract:

MicroRNAs are a novel class of small RNAs which regulate gene expression by translational repression or degradation of messenger RNAs. Sensitive, simple and cost-effective assays for microRNA detection are in urgent demand due to the important role of these biomolecules in the progression of human diseases such as Alzheimer’s, multiple sclerosis, and other neurodegenerative diseases. Herein, we report several novel, sensitive and specific microRNA nanobiosensors designed on the basis of colorimetric and fluorescence detection with nanoparticles and of hybridization chain reaction amplification as an enzyme-free amplification. These new strategies eliminate the need for enzymatic reactions, chemical modifications, separation processes and sophisticated equipment, while achieving a low limit of detection with high specificity. The important features of these methods are high sensitivity and the specificity to differentiate between perfectly matched, mismatched and non-complementary target microRNAs, as well as a decent response in real sample analysis with blood plasma. These nanobiosensors can be used clinically not only for the early detection of neurological diseases but also for any disease related to miRNAs, by direct detection of plasma microRNAs in real clinical samples without the need for sample preparation, RNA extraction and/or amplification.

Keywords: hybridization chain reaction, microRNA, nanobiosensor, neurodegenerative diseases

Procedia PDF Downloads 151
1972 On the Effectiveness of Electricity Market Development Strategies: A Target Model for a Developing Country

Authors: Ezgi Avci-Surucu, Doganbey Akgul

Abstract:

Turkey’s energy reforms have achieved energy security through a variety of interlinked measures, including electricity, gas, renewable energy and energy efficiency legislation; the establishment of an energy sector regulatory authority; energy price reform; the creation of a functional electricity market; restructuring of state-owned energy enterprises; and private sector participation through privatization and new investment. However, the current strategies, namely the “Electricity Sector Reform and Privatization Strategy” and the “Electricity Market and Supply Security Strategy”, have been criticized in various respects. The present paper analyzes the implementation of the aforementioned strategies in the framework of generation scheduling, transmission constraints, bidding structure and general aspects, and discusses the deficiencies of the current strategies, which decelerate power investments and create uncertainties. We conclude with policy suggestions to eliminate these deficiencies in terms of price and risk management, infrastructure, customer-focused regulations and systematic market development.

Keywords: electricity markets, risk management, regulations, balancing and settlement, bilateral trading, generation scheduling, bidding structure

Procedia PDF Downloads 553
1971 Corporate Voluntary Greenhouse Gas Emission Reporting in United Kingdom: Insights from Institutional and Upper Echelons Theories

Authors: Lyton Chithambo

Abstract:

This paper reports the results of an investigation into the extent to which various stakeholder pressures influence voluntary disclosure of greenhouse-gas (GHG) emissions in the United Kingdom (UK). The study, which is grounded in institutional theory, also borrows insights from upper echelons theory and examines whether specific managerial (chief executive officer) characteristics explain, and moderate, various stakeholder pressures in explaining GHG voluntary disclosure. Data were obtained from the 2011 annual and sustainability reports of a sample of 216 UK companies on the FTSE350 index of the London Stock Exchange. Generally, the results suggest that there is no substantial shareholder or employee pressure on a firm to disclose GHG information, but there is significant positive pressure from the market status of a firm, with firms holding more market share disclosing more GHG information. Consistent with the predictions of institutional theory, we found evidence that coercive pressure (i.e. regulatory pressure) and the mimetic pressures emanating in some industries, notably industrials and consumer services, have a significant positive influence on firms’ GHG disclosure decisions. Creditor pressure also had a significant negative relationship with GHG disclosure. While CEO age had a direct negative effect on GHG voluntary disclosure, its moderating effect on the influence of stakeholder pressure on GHG disclosure was significant only for regulatory pressure. The results have important implications both for policy makers and for company boards strategizing to rein in their GHG emissions.

Keywords: greenhouse gases, voluntary disclosure, upper echelons theory, institutional theory

Procedia PDF Downloads 233
1970 Fundamentals of Performance Management in the World of Public Service Organizations

Authors: Daniella Kucsma

Abstract:

The examination of public service organizations' performance evaluation includes several steps that help public organizations develop a more efficient system. Public sector organizations have different characteristics than the competitive sector, so other, newer elements become more important in their performance processes. The literature in this area is diverse, so highlighting an indicator system can be useful when introducing a system, but it is also worthwhile to measure the specific elements of the organization. In the case of a public service organization, the service obligation usually entails a high number of users, so meeting expectations is more difficult. For the organization, an important goal is to place great emphasis on raising service standards and developing the related processes. In this research, the health sector is given a prominent role, as it is a sensitive area where both organizational and individual performance matter to all participants. As a primary step, the content of the strategy is decisive, as it shapes the efficient structure of the process. When designing any system, it is important to review the expectations of the stakeholders, as these are primary considerations in the design. The goal of this paper is to build the foundations of a performance management and indexing framework that can help a hospital provide effective feedback and a direction that is important in assessing and developing a service, and that can become a management philosophy.

Keywords: health sector, public sector, performance management, strategy

Procedia PDF Downloads 193
1969 Investigation of the Speckle Pattern Effect for Displacement Assessments by Digital Image Correlation

Authors: Salim Çalışkan, Hakan Akyüz

Abstract:

Digital image correlation has become established as a versatile and efficient method for measuring displacements on specimen surfaces by comparing reference subsets in undeformed images with target subsets in the deformed image. The theoretical model indicates that the accuracy of digital image correlation displacement data can be predicted from the variance of the image noise and the sum of the squares of the subset intensity gradients. The digital image correlation procedure locates each subset of the original image in the deformed image. The software then determines the displacement values of the centers of the subsets, providing the complete displacement measurements. In this paper, the effect of the speckle distribution on measured out-of-plane displacement data, as a function of subset size, was investigated. Nine groups of speckle patterns were used in this study: samples were sprayed randomly through pre-manufactured patterns of three different hole diameters, each with three coverage ratios, produced on a computer numerical control punch press. The resulting displacement values, referenced at the center of the subset, are evaluated based on the average of the displacements of the pixels inside the subset.
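The subset-matching step described above can be illustrated with a minimal integer-pixel search that minimizes the sum of squared differences; production DIC software adds subpixel interpolation and subset shape functions, so this is only a sketch of the principle:

```python
import numpy as np

def match_subset(ref_img, def_img, center, half, search=5):
    """Locate a reference subset in the deformed image by minimizing the
    sum of squared differences (SSD) over integer shifts.

    center: (row, col) of the subset center; half: subset half-width,
    giving a (2*half+1)^2 subset; search: half-width of the search window.
    Returns the integer-pixel displacement (d_row, d_col).
    """
    r, c = center
    ref = ref_img[r - half:r + half + 1, c - half:c + half + 1].astype(float)
    best, best_ssd = (0, 0), np.inf
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            sub = def_img[r + dr - half:r + dr + half + 1,
                          c + dc - half:c + dc + half + 1].astype(float)
            ssd = np.sum((ref - sub) ** 2)
            if ssd < best_ssd:
                best_ssd, best = ssd, (dr, dc)
    return best
```

A speckle pattern with strong intensity gradients makes the SSD minimum sharp and unique, which is exactly why the pattern's hole diameter and coverage ratio matter for displacement accuracy.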

Keywords: digital image correlation, speckle pattern, experimental mechanics, tensile test, aluminum alloy

Procedia PDF Downloads 74
1968 Application of Data Mining Techniques for Tourism Knowledge Discovery

Authors: Teklu Urgessa, Wookjae Maeng, Joong Seek Lee

Abstract:

Five implementations of three data mining classification techniques were evaluated for extracting important insights from tourism data. The aim was to find the best-performing algorithm among those compared for tourism knowledge discovery. The knowledge discovery from data process was used as a process model, and 10-fold cross validation was used for testing. Various data preprocessing activities were performed to obtain the final dataset for model building. Classification models of the selected algorithms were built under different scenarios on the preprocessed dataset. The best-performing algorithm on the tourism dataset was Random Forest (76%) before applying information gain based attribute selection, and J48 (C4.5) (75%) after selecting the attributes most relevant to the class (target) attribute. In terms of model-building time, attribute selection improved the efficiency of all algorithms, with the Artificial Neural Network (multilayer perceptron) showing the highest improvement (90%). The rules extracted from the decision tree model are presented; they revealed intricate, non-trivial knowledge and insights that would otherwise not be discovered by simple statistical analysis.
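The information gain based attribute selection mentioned above can be sketched in a few lines, shown here for categorical attributes as a toy stand-in for the filter actually used:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, attr_idx, labels):
    """Information gain of attribute attr_idx with respect to the class:
    class entropy minus the weighted entropy after splitting on the attribute."""
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attr_idx], []).append(y)
    remainder = sum(len(ys) / len(labels) * entropy(ys)
                    for ys in by_value.values())
    return entropy(labels) - remainder

def top_attributes(rows, labels, k):
    """Rank attributes by information gain and keep the k most relevant."""
    ranked = sorted(range(len(rows[0])),
                    key=lambda i: info_gain(rows, i, labels), reverse=True)
    return ranked[:k]
```

Restricting model building to the top-ranked attributes is what shortens training time for all the classifiers, as the abstract reports.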

Keywords: classification algorithms, data mining, knowledge discovery, tourism

Procedia PDF Downloads 295
1967 Case Study of Spacecraft Instruments in Structural Modelling with Nastran-Patran

Authors: Francisco Borja de Lara, Ali Ravanbakhsh, Robert F. Wimmer-Schweingruber, Lars Seimetz, Fermín Navarro

Abstract:

The intense structural loads during the launch of a spacecraft represent a challenge for space structure designers, because sufficient strength has to be achieved while keeping the mass and volume within the allowable margins of the mission requirements and within the limits of the project budget. In this paper, we present the structural analysis of the Lunar Lander Neutron Dosimetry (LND) experiment on the Chang'E4 mission, the first probe to land on the Moon’s far side, part of the Chinese National Space Administration's Lunar Exploration Program. To this end, the software Nastran/Patran has been used: a structural model was built in Patran and a structural analysis was carried out with Nastran. The results obtained are used both for the optimization process of the spacecraft structure and as input parameters for the structural test campaign of the model. In this way, the feasibility of the lunar instrument structure is demonstrated in terms of modal behavior, stresses, and random vibration response, and our results provide a better understanding of the design of the structural tests.
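The modal results reported above come from a normal-modes solution; at its core, the natural frequencies follow from the generalized eigenvalue problem Kφ = ω²Mφ. A toy sketch with small dense matrices (not the LND model, which Nastran solves at scale):

```python
import numpy as np

def natural_frequencies(K, M):
    """Natural frequencies (Hz) of an undamped system with stiffness
    matrix K and mass matrix M: the eigenvalues of M^-1 K are omega^2,
    and f = omega / (2*pi). Toy dense-matrix version of what a
    Nastran normal-modes run computes."""
    lam = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(np.abs(lam.real))) / (2 * np.pi)
```

Comparing the lowest computed frequencies against launcher-specified minimums is a typical first feasibility check before running stress and random-vibration analyses.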

Keywords: Chang’E4, Chinese national space administration, lunar lander neutron dosimetry, nastran-patran, structural analysis

Procedia PDF Downloads 529
1966 Corpus-Based Description of Core English Nouns of Pakistani English: An EFL Learner Perspective at Secondary Level

Authors: Abrar Hussain Qureshi

Abstract:

Vocabulary has been highlighted as a key indicator in any foreign language learning program, especially English as a foreign language (EFL). It is often considered a potential tool in a foreign language curriculum, and a deficient vocabulary impedes successful communication in the target language. Knowledge of the lexicon is very significant for achieving communicative competence and performance. Nouns constitute a considerable bulk of English vocabulary; indeed, they are the backbone of the English language and the main semantic carriers in spoken and written discourse. As nouns dominate the English lexicon, their role becomes all the more important. The undertaken research is a systematic effort in this regard to compile a list of highly frequent Pakistani English nouns for EFL learners at the secondary level. It will encourage learner autonomy among EFL learners as well as save their time. The corpus used for the research has been developed locally from leading English newspapers of Pakistan. WordSmith Tools has been used to process the research data and to retrieve a word list of frequent Pakistani English nouns. The retrieved list of core Pakistani English nouns is expected to be useful for English language learners at the secondary level, as it covers a wide range of speech events.
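The word-list step performed with WordSmith Tools can be approximated in a few lines; noun identification here is assumed to come from a precompiled candidate set rather than the POS tagging a full pipeline would use:

```python
import re
from collections import Counter

def frequency_list(corpus_text, noun_candidates):
    """Build a descending frequency list of target nouns from raw corpus
    text. noun_candidates is a set of lowercase noun forms; in a real
    pipeline it would come from a POS-tagged corpus."""
    tokens = re.findall(r"[a-z]+", corpus_text.lower())
    counts = Counter(t for t in tokens if t in noun_candidates)
    return counts.most_common()
```

Sorting by raw frequency and cutting the list at a coverage threshold is the usual way such a core-vocabulary list is finalized for learners.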

Keywords: corpus, EFL, frequency list, nouns

Procedia PDF Downloads 103
1965 Phytochemical and Biological Study of Chrozophora oblongifolia

Authors: Al-Braa Kashegari, Ali M. El-Halawany, Akram A. Shalabi, Sabrin R. M. Ibrahim, Hossam M. Abdallah

Abstract:

Chemical investigation of Chrozophora oblongifolia resulted in the isolation of five major compounds, identified as apigenin-7-O-glucoside (1), quercetin-3-O-glucuronide (2), quercetin-3-O-galacturonide (3), rutin (4), and 1,3,6-trigalloyl glucose (5). The identities of the isolated compounds were confirmed by different spectroscopic methods, including one- and two-dimensional NMR. The isolated compounds were tested for their antioxidant activity using different assays, viz. DPPH, FRAP, ABTS, ORAC, and metal chelation. In addition, inhibition assays against target enzymes involved in metabolic syndrome, such as alpha-glucosidase and pancreatic lipase, were carried out. Moreover, the effect of the compounds on advanced glycation end-products (AGEs), one of the major complications of oxidative stress and hyperglycemia in metabolic syndrome, was assessed using BSA-fructose (bovine serum albumin), BSA-methylglyoxal, and arginine-methylglyoxal models. The pure isolates showed a protective effect against metabolic syndrome as well as promising antioxidant activity. Compound 5 showed potent activity in all measured parameters; meanwhile, none of the tested compounds showed activity against pancreatic lipase.

Keywords: Chrozophora oblongifolia, antioxidant, pancreatic lipase, metabolic syndromes

Procedia PDF Downloads 111
1964 Marketing Strategy Adjustment of Multinational Companies in China in the New Period

Authors: Xue Junwei

Abstract:

The rapid economic development of China has made it a critical global market. Multinational companies operating in China face evolving challenges, necessitating adjustments in their marketing strategies. This study explores the trends in, and countermeasures for, adjusting the marketing strategies of multinational companies in China. Data were collected using qualitative methods, including interviews and case studies, and quantitative methods such as questionnaires. The collected data were analyzed using SWOT analysis to identify the strengths, weaknesses, opportunities, and threats faced by multinational companies in China, guiding the formulation of effective marketing strategies. The study reveals emerging trends and proposes strategic countermeasures for multinational companies to adapt their marketing strategies in the Chinese market. It addresses the challenges faced by multinational companies in China, the need for strategic adjustments, and potential approaches to enhancing marketing effectiveness in this market. This research contributes to the existing literature by providing insights into the dynamic environment multinational companies face in China and by offering practical recommendations for strategy adjustments. The study emphasizes the significance of adapting marketing strategies to align with the changing landscape of the Chinese market and provides actionable recommendations for multinational companies to thrive in this environment.

Keywords: multinational company, marketing strategies, Chinese market, SWOT

Procedia PDF Downloads 6
1963 Assertion-Driven Test Repair Based on Priority Criteria

Authors: Ruilian Zhao, Shukai Zhang, Yan Wang, Weiwei Wang

Abstract:

Repairing broken test cases is an expensive and challenging task in evolving software systems. Although an automated repair technique with intent preservation has been proposed, it does not take into account the association between test repairs and assertions, leading to a large number of irrelevant candidates and decreasing the repair capability. This paper proposes an assertion-driven test repair approach. Furthermore, an intent-oriented priority criterion is proposed to guide repair candidate generation, making the repairs closer to the intent of the test. In more detail, repair targets are determined through post-dominance relations between assertions and the methods that directly cause compilation errors. Then, test repairs are generated from the target in a bottom-up way, guided by the intent-oriented priority criteria. Finally, the generated repair candidates are prioritized to match the original test intent. The approach was implemented and evaluated on a benchmark of 4 open-source programs and 91 broken test cases. The results show that the approach can fix 89% (81/91) of broken test cases, which is more effective than the existing intent-preserving test repair approach, and that our intent-oriented priority criteria work well.
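A toy illustration of an intent-oriented priority criterion of the kind described above: candidates whose edits touch more of the variables an assertion depends on rank first, with smaller edits breaking ties. The candidate fields are illustrative, not the paper's actual representation:

```python
def prioritize(candidates, assertion_vars):
    """Rank repair candidates by a simple intent-oriented priority:
    more overlap with the variables the assertion depends on comes
    first; among equal overlaps, smaller edits are preferred."""
    def score(c):
        overlap = len(set(c["vars"]) & set(assertion_vars))
        return (-overlap, c["edit_size"])
    return sorted(candidates, key=score)
```

Ranking candidates this way prunes repairs that compile but drift from the assertion's intent, which is the failure mode the abstract attributes to prior intent-preserving approaches.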

Keywords: test repair, test intent, software test, test case evolution

Procedia PDF Downloads 129
1962 Optimal Allocation of Multiple Emergency Resources for a Single Potential Accident Node: A Mixed Integer Linear Program

Authors: Yongjian Du, Jinhua Sun, Kim M. Liew, Huahua Xiao

Abstract:

Optimal allocation of emergency resources before a disaster is of great importance for emergency response. In reality, pre-protection for a single critical node where accidents may occur is common. In this study, a model is developed to determine the location and inventory decisions for multiple emergency resources among a set of candidate stations, minimizing the total cost subject to budgetary and capacity constraints. The total cost includes the economic accident loss, which follows a probability distribution over time, and the warehousing cost of resources, which increases over time. A ratio is set to measure the degree to which a storage station serves only the target node; it grows as the distance between them decreases. To keep the program linear, it is assumed that the travel time of emergency resources to the accident scene has a linear relationship with the economic accident loss. A computational experiment is conducted to illustrate how the proposed model works, and the results indicate its effectiveness and practicability.
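As a toy illustration of the kind of pre-allocation decision the model formalizes, the sketch below enumerates integer inventory choices across hypothetical candidate stations, minimizing warehousing cost plus a distance-weighted accident loss under a budget constraint. The station data, demand, and loss coefficient are invented for illustration; the paper's actual formulation is a mixed integer linear program solved with an optimizer rather than by brute force.

```python
from itertools import product

# Hypothetical data: candidate stations with distance (km) to the target
# node, storage capacity (units), and per-unit warehousing cost.
stations = {
    "S1": {"dist": 5,  "cap": 10, "store": 2.0},
    "S2": {"dist": 12, "cap": 20, "store": 1.0},
    "S3": {"dist": 20, "cap": 30, "store": 0.5},
}
BUDGET = 25.0           # total warehousing budget
DEMAND = 15             # units required at the accident node
LOSS_PER_KM_UNIT = 0.3  # assumed linear loss coefficient (travel time ~ distance)

def total_cost(alloc):
    """Warehousing cost plus distance-weighted accident loss."""
    store = sum(stations[s]["store"] * q for s, q in alloc.items())
    loss = sum(stations[s]["dist"] * LOSS_PER_KM_UNIT * q for s, q in alloc.items())
    return store + loss, store

best = None
ranges = [range(stations[s]["cap"] + 1) for s in stations]
for qs in product(*ranges):            # enumerate all integer inventory vectors
    alloc = dict(zip(stations, qs))
    if sum(qs) < DEMAND:
        continue                        # must cover the demand of the target node
    cost, store = total_cost(alloc)
    if store > BUDGET:
        continue                        # budgetary constraint
    if best is None or cost < best[0]:
        best = (cost, alloc)

print(best)
```

With these numbers, the cheapest feasible plan fills the nearest station to capacity and tops up from the next-cheapest one until the budget binds.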

Keywords: emergency response, integer linear program, multiple emergency resources, pre-allocation decisions, single potential accident node

Procedia PDF Downloads 154
1961 Investigating the Challenges and Opportunities for M-Government Implementation in Saudi Arabia

Authors: Anan Alssbaiheen, Steve Love

Abstract:

Given the lack of research into the potential opportunities and challenges likely to be associated with the implementation of mobile services in developing countries, including Saudi Arabia, the research reported here investigated the challenges and opportunities associated with the implementation of mobile government services in Saudi Arabia. Data were collected through surveys of 103 Saudi citizens and 46 employees working at the Ministry of Communication and Information Technology, Saudi Arabia. The study indicates that the high level of mobile penetration in the country offers an opportunity for the Saudi Arabian government to offer mobile government services. The results also suggest that, though a large percentage of the population does not have access to mobile technologies, there is still a strong desire among users for the provision of mobile government services. Moreover, the results suggest that effective implementation of mobile government services would help to increase the technological development of Saudi Arabia. However, certain challenges may prevent the effective implementation of such services. First, there does not appear to be a sufficient level of understanding among the Saudi Arabian population of the benefits associated with mobile government services. Second, the results suggest that the implementation of the services needs to be closely tailored and personalised to the individual needs of target users. Finally, the lack of access to mobile technologies would be a challenge to the successful introduction of these services.

Keywords: challenges, e-government, mobile government, opportunities

Procedia PDF Downloads 414
1960 High Performance Field Programmable Gate Array-Based Stochastic Low-Density Parity-Check Decoder Design for IEEE 802.3an Standard

Authors: Ghania Zerari, Abderrezak Guessoum, Rachid Beguenane

Abstract:

This paper introduces a high-performance architecture for a fully parallel stochastic Low-Density Parity-Check (LDPC) decoder implemented on a field programmable gate array (FPGA). The new approach is designed to decrease decoding latency and reduce FPGA logic utilisation. To accomplish the targeted reduction in logic utilisation, the routing of the proposed sub-variable node (VN) internal memory is designed to use one slice of distributed RAM. Furthermore, VN initialization using the channel input probability is performed to enhance decoder convergence, without extra resources and without integrating output saturated counters. The Xilinx FPGA implementation of the IEEE 802.3an standard LDPC code shows that the proposed decoding approach attains high performance along with reduced FPGA logic utilisation.
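The stochastic decoding mentioned above rests on stochastic computing, where probabilities are represented as Bernoulli bitstreams and arithmetic reduces to simple logic gates. A minimal Python sketch of that underlying principle (not the paper's decoder architecture): an AND gate applied to two independent streams multiplies their encoded probabilities.

```python
import random

random.seed(42)

def bitstream(p, n):
    """Encode probability p as a Bernoulli bitstream of length n."""
    return [1 if random.random() < p else 0 for _ in range(n)]

def sc_multiply(a, b):
    """Bitwise AND of two independent streams multiplies their probabilities."""
    return [x & y for x, y in zip(a, b)]

N = 100_000
a = bitstream(0.8, N)
b = bitstream(0.5, N)
prod = sc_multiply(a, b)
estimate = sum(prod) / N   # fraction of 1s approximates 0.8 * 0.5
print(f"0.8 * 0.5 ~= {estimate:.3f}")
```

In hardware, this trades multipliers for single gates at the cost of long streams and latency, which is why the paper's contribution targets latency and logic utilisation.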

Keywords: low-density parity-check (LDPC) decoder, stochastic decoding, field programmable gate array (FPGA), IEEE 802.3an standard

Procedia PDF Downloads 297
1959 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network

Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

Consider a database that records average traffic speeds, measured at five-minute intervals, for all the links in the traffic network of a metropolitan city. Models learned from these data that can predict future traffic speed would be beneficial for applications such as car navigation systems, but building a predictive model for every link becomes a nontrivial job if the number of links in the network is huge. An advantage of adopting k-nearest neighbor (k-NN) as the predictive model is that it does not require any explicit model building. Instead, k-NN takes a long time to make a prediction because it needs to search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale is that we need only look at the recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different sets of features, namely the current and past traffic speeds of the target link and of the neighbor links up- and downstream of it. The performances of these models are compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
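A minimal sketch of the idea, with invented speed records: k-NN prediction needs no model building, and restricting the search to recent records cuts prediction time. The feature vectors here stand in for current and past speeds of the target link and its neighbors.

```python
from math import dist  # Euclidean distance, Python 3.8+

# Hypothetical history: each record is (feature_vector, next_speed), where
# the features are recent speeds of the target link and its neighbor links.
history = [
    ((52.0, 48.0, 50.0), 51.0),
    ((30.0, 28.0, 33.0), 29.0),
    ((55.0, 50.0, 49.0), 53.0),
    ((31.0, 30.0, 29.0), 30.0),
    ((50.0, 51.0, 52.0), 52.0),
]

def knn_predict(query, records, k=3):
    """Average the target speeds of the k nearest feature vectors."""
    nearest = sorted(records, key=lambda r: dist(query, r[0]))[:k]
    return sum(speed for _, speed in nearest) / k

# Restrict the search to the most recent records only, as suggested above,
# to cut prediction time without rebuilding any model.
recent = history[-4:]
print(knn_predict((51.0, 49.0, 50.0), recent, k=2))  # -> 52.5
```

Shrinking `recent` further speeds up the neighbor search linearly, at the risk of dropping neighbors that would have improved accuracy, which is exactly the trade-off the experiments measure.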

Keywords: big data, k-NN, machine learning, traffic speed prediction

Procedia PDF Downloads 363
1958 The Effect of Energy Consumption and Losses on the Nigerian Manufacturing Sector: Evidence from the ARDL Approach

Authors: Okezie A. Ihugba

Abstract:

The bounds testing ARDL(2, 2, 2, 2, 0) approach to cointegration was used in this study to investigate the effect of energy consumption and energy loss on Nigeria's manufacturing sector from 1981 to 2020. The model was created to determine the relationship between these variables while also accounting for interactions with control variables such as inflation and commercial bank loans to the manufacturing sector. When the dependent variables are energy consumption and energy loss, the bounds tests show that the variables of interest are bound together in the long run. Because electricity consumption is a critical factor in determining manufacturing value-added in Nigeria, some intriguing observations were made. The relationship between LELC and LMVA is statistically significant, and the findings indicate that electricity consumption reduces manufacturing value-added. The target variable (energy loss) is statistically significant and has a positive sign: in Nigeria, a 1% reduction in energy loss increases manufacturing value-added by 36% at the first lag and 35% at the second. The study recommends that the government speed up the ongoing renovation of existing power plants across the country, as well as the construction of new gas-fired power plants. This would address a number of issues, including the overpricing of electricity that results from grid failure.

Keywords: L60, Q43, H81, C52, E31, ARDL, cointegration, Nigeria's manufacturing

Procedia PDF Downloads 177
1957 Use of Multistage Transition Regression Models for Credit Card Income Prediction

Authors: Denys Osipenko, Jonathan Crook

Abstract:

Because of the variety of cardholders' behaviour types and income sources, each consumer account can move through a variety of states: inactive, transactor, revolver, delinquent, and defaulted, each of which requires an individual model for income prediction. Estimating transition probabilities between statuses at the account level helps to avoid the memorylessness of the Markov chain approach. This paper investigates transition probability estimation approaches for credit card income prediction at the account level. The key question of the empirical research is which approach gives more accurate results: multinomial logistic regression, or multistage conditional logistic regression with a binary target. Both models have shown moderate predictive power. Prediction accuracy for conditional logistic regression depends on the order of stages in the conditional binary logistic regression. On the other hand, multinomial logistic regression is easier to use and gives integrated estimates for all states without priorities. Further investigations can therefore concentrate on alternative modeling approaches, such as discrete choice models.
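Before fitting either regression, the transition structure itself can be estimated empirically from account histories. A minimal sketch with hypothetical monthly state sequences (the paper's models additionally condition on account-level covariates, which is what lets them escape the memorylessness of a plain Markov chain):

```python
from collections import Counter, defaultdict

# The five account states considered above.
STATES = ["inactive", "transactor", "revolver", "delinquent", "defaulted"]

# Hypothetical monthly state sequences for a handful of accounts.
accounts = [
    ["transactor", "transactor", "revolver", "revolver", "delinquent"],
    ["inactive", "transactor", "transactor", "transactor"],
    ["revolver", "revolver", "delinquent", "defaulted"],
]

# Count observed one-step transitions across all accounts.
counts = defaultdict(Counter)
for seq in accounts:
    for src, dst in zip(seq, seq[1:]):
        counts[src][dst] += 1

# Convert counts into empirical transition probabilities per source state.
probs = {
    src: {dst: n / sum(c.values()) for dst, n in c.items()}
    for src, c in counts.items()
}
print(probs["revolver"])
```

A multinomial logistic regression replaces each row of this table with probabilities that depend on account covariates, which is the account-level refinement the paper compares against the multistage binary alternative.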

Keywords: multinomial regression, conditional logistic regression, credit account state, transition probability

Procedia PDF Downloads 487
1956 Gynocentrism and Self-Orientalization: A Visual Trend in Chinese Fashion Photography

Authors: Zhen Sun

Abstract:

The study adopts the method of visual social semiotics to analyze a sample of fashion photos recently published in Chinese fashion magazines that target both male and female readers. It identifies a new visual trend in fashion photography, characterized by two features. First, the photos pair young, confident, and stylish female models with lower-class, sloppy old men. The visual disharmony between the sexually desirable women and the aged men suggests an impossibly accomplished sexuality and eroticism. Though the women are still under the male gaze, they are depicted as unreachable objects of voyeurism rather than as sexual objects subordinated to men. Second, the represented people are usually placed against the backdrop of tasteless or vulgar Chinese town life, which is congruent with the images of the men but makes the modern city girls look out of place. The photographers intentionally contrast the images of women with those of men and with the background, implying an imaginary binary division of modern Orientalism and a self-orientalization strategy on the photographers' part. Under the theoretical umbrella of neoliberal postfeminism, this study defines a new kind of gynocentric stereotype in Chinese fashion photography, one that challenges previous observations on gender portrayals in fashion magazines.

Keywords: fashion photography, gynocentrism, neoliberal postfeminism, self-orientalization

Procedia PDF Downloads 424
1955 A Framework for Secure Information Flow Analysis in Web Applications

Authors: Ralph Adaimy, Wassim El-Hajj, Ghassen Ben Brahim, Hazem Hajj, Haidar Safa

Abstract:

Huge amounts of data and personal information are sent to and retrieved from web applications on a daily basis. Every application has its own confidentiality and integrity policies. Violating these policies can have a broad negative impact on the involved company's financial status, while enforcing them is very hard even for developers with a good security background. In this paper, we propose a framework that enforces security-by-construction in web applications. Minimal developer effort is required, in the sense that the developer only needs to annotate database attributes with a security class. The web application code is then converted into an intermediary representation, called the Extended Program Dependence Graph (EPDG). Using the EPDG, the provided annotations are propagated to the application code and run against generic security enforcement rules that were carefully designed to detect insecure information flows as early as they occur. As a result, any violation of the data's confidentiality or integrity policies is reported. As a proof of concept, two PHP web applications, Hotel Reservation and Auction, were used for testing and validation. The proposed system was able to catch all the existing insecure information flows at their source. Moreover, to highlight the simplicity of the suggested approach compared to existing ones, two professional web developers assessed the annotation tasks needed in the presented case studies and provided very positive feedback on the simplicity of the annotation task.
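The idea of propagating security annotations along dependence edges can be sketched in a few lines. The graph, node names, and labels below are hypothetical, and the real EPDG is far richer, but the fixed-point propagation and sink check illustrate how an insecure flow is caught at its source:

```python
# Dependence edges: data flows from each key to every node in its value list.
edges = {
    "db.credit_card": ["invoice_total"],
    "db.username": ["greeting"],
    "invoice_total": ["page_output"],
    "greeting": ["page_output"],
}

# Annotations, as a developer would supply on database attributes.
labels = {"db.credit_card": "high", "db.username": "low"}
SINKS = {"page_output": "low"}  # a public sink may only receive "low" data

def propagate(edges, labels):
    """Push labels forward along edges to a fixed point; 'high' dominates."""
    result = dict(labels)
    changed = True
    while changed:
        changed = False
        for src, dsts in edges.items():
            for dst in dsts:
                if result.get(src, "low") == "high" and result.get(dst) != "high":
                    result[dst] = "high"
                    changed = True
    return result

flows = propagate(edges, labels)
violations = [n for n, allowed in SINKS.items()
              if allowed == "low" and flows.get(n) == "high"]
print(violations)
```

Here the confidential attribute taints `invoice_total` and then the public `page_output` sink, so the flow is reported; the low-labelled `db.username` path raises nothing.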

Keywords: web applications security, secure information flow, program dependence graph, database annotation

Procedia PDF Downloads 471
1954 Commercial Automobile Insurance: A Practical Approach of the Generalized Additive Model

Authors: Nicolas Plamondon, Stuart Atkinson, Shuzi Zhou

Abstract:

The insurance industry is usually not the first topic one has in mind when thinking about applications of data science. However, the use of data science in the finance and insurance industry is growing quickly for several reasons, including an abundance of reliable customer data and ferocious competition requiring more accurate pricing. Among the top use cases of data science, we find pricing optimization, customer segmentation, customer risk assessment, fraud detection, marketing, and triage analytics. The objective of this paper is to present an application of the generalized additive model (GAM) to a commercial automobile insurance product: individually rated commercial automobiles. These are vehicles used for commercial purposes for which there is not enough volume to price several vehicles at the same time. The GAM was selected as an improvement over the GLM for its ease of use and its wide range of applications. The model was trained on the largest split of the data to determine the model parameters, and the remaining data were used as testing data to verify the quality of the modeling activity. We used the Gini coefficient to evaluate the performance of the model; for long-term monitoring, commonly used metrics such as RMSE and MAE will be used. Another topic of interest in the insurance industry is the process of producing the model. We discuss at a high level the interactions between the different teams within an insurance company that need to work together to produce a model and then monitor its performance over time. Moreover, we discuss the regulations in place in the insurance industry. Finally, we discuss the maintenance of the model, the fact that new data do not arrive constantly, and that some metrics can take a long time to become meaningful.
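The Gini coefficient used for evaluation can be computed from the Lorenz curve of actual losses ordered by model prediction, normalized by the Gini of a perfect model. A minimal sketch with invented claim amounts and two hypothetical competing predictions:

```python
def gini(actual, pred):
    """Unnormalized Gini: area between the Lorenz curve of actual losses,
    ordered by model prediction (best risks last), and the diagonal."""
    pairs = sorted(zip(pred, actual), key=lambda p: p[0], reverse=True)
    total = sum(actual)
    cum, area = 0.0, 0.0
    for _, a in pairs:
        cum += a
        area += cum / total
    n = len(actual)
    return (area - (n + 1) / 2.0) / n

def gini_normalized(actual, pred):
    """Scale by the Gini of a perfect model, so 1.0 means perfect ranking."""
    return gini(actual, pred) / gini(actual, actual)

# Hypothetical claim amounts and two competing model predictions.
claims = [0, 0, 500, 0, 2000, 100]
good_model = [10, 20, 400, 15, 1800, 90]   # ranks claims correctly
bad_model = [100, 90, 80, 70, 60, 50]      # ranks them mostly backwards

print(gini_normalized(claims, good_model))
print(gini_normalized(claims, bad_model))
```

The normalized Gini rewards a model for ranking risks correctly rather than for matching claim amounts exactly, which is why it is a common pricing metric alongside RMSE and MAE.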

Keywords: insurance, data science, modeling, monitoring, regulation, processes

Procedia PDF Downloads 76
1953 Assessment of Work Postures and Prevalence of Musculoskeletal Disorders among Diamond Polishers in Botswana: A Case Study

Authors: Oanthata Jester Sealetsa, Richie Moalosi

Abstract:

Musculoskeletal disorders (MSDs) are reported to be among the leading contributing factors to low productivity in many industries across the world, with new emerging economies (NECs) such as Botswana being the most affected, owing to a lack of expertise and resources to deal with existing ergonomics challenges. This study aimed to evaluate occupational postures and the prevalence of musculoskeletal disorders among diamond polishers in a diamond company in Botswana. A case study was conducted with 106 diamond polishers in Gaborone, Botswana. A case study was chosen because it can investigate and explore an issue thoroughly and deeply, and it records behaviour over time so that changes in behaviour can be identified. The Corlett and Bishop body map was used to determine the frequency of MSD symptoms in different body parts of the workers. This was followed by the Rapid Entire Body Assessment (REBA) to evaluate the occupational postural risks of MSDs. Descriptive statistics, the chi-square test, and logistic regression were used for data analysis. The results reveal that workers experienced pain in the upper back, lower back, shoulders, neck, and wrists, with the most pain reported in the upper back (44.6%) and lower back (44.2%). The mean REBA score of 6.07 suggests that sawing, bruting, and polishing were the most hazardous processes in diamond polishing. The study recommends a redesign of the diamond polishing workstations to accommodate the anthropometric characteristics of Batswana (people from Botswana) and prevent the development of MSDs.

Keywords: assessment, Botswana, diamond polishing, ergonomics, musculoskeletal disorders, occupational postural risks

Procedia PDF Downloads 180
1952 Implementation of a Program of Orientation for Travel Nursing Staff Based on Nurse-Identified Learning Needs

Authors: Olga C. Rodrigue

Abstract:

Long-term care and skilled nursing facilities experience ebbs and flows of nursing staffing, a problem compounded by the perception of the facilities as undesirable workplaces and competition for staff from other healthcare entities. Travel nurses are contracted to fill staffing needs due to increased admissions, increased and unexpected attrition of nurses, or facility expansion of services. Prior to beginning the contracted assignment, the travel nurse must meet industry, company, and regulatory requirements (The Joint Commission and CMS) for skills and knowledge. Travel nurses, however, inconsistently receive the pre-assignment orientation needed to work at the contracted facility, if any information is given at all. When performance expectations are not met, travel nurses may subsequently choose to leave the position without completing the terms of the contract, and some facilities may choose to terminate the contract prior to the expected end date. The overarching goal of the Doctor of Nursing Practice evidence-based practice improvement project is to provide travel nurses with the basic and necessary information to prepare them to begin a long-term and skilled nursing assignment. The project involves the identification of travel nurse learning needs through a survey and the development and provision of web-based learning modules to address those needs prior to arrival for a long-term and skilled nursing assignment.

Keywords: nurse staffing, travel nurse, travel staff, contract staff, contracted assignment, long-term care, skilled nursing, onboarding, orientation, staff development, supplemental staff

Procedia PDF Downloads 168
1951 Open Source Knowledge Management Approach to Manage and Disseminate Distributed Content in a Global Enterprise

Authors: Rahul Thakur, Onkar Chandel

Abstract:

Red Hat is the world leader in providing open source software and solutions. A global enterprise like Red Hat faces unique issues in connecting employees with content because of distributed offices, multiple teams spread across geographies, multiple languages, and different cultures. Employees of a global company create content that is distributed across departments, teams, regions, and countries. This makes finding the best content difficult, since owners keep iterating on the existing content. When employees are unable to find content, they end up creating it once again, duplicating existing material and effort in the process. Employees may also fail to find the relevant content and spend time reviewing obsolete, duplicate, or irrelevant content. On average, a person spends 15 minutes per day in failed searches, which can result in missed business opportunities, employee frustration, and substandard deliverables. The Red Hat Knowledge Management Office (KMO) applied an 'open source strategy' to solve the above problems. Under the open source strategy, decisions are taken collectively; the strategy aims at accomplishing common goals with the help of communities. The objectives of this initiative were to save employees' time, give them authentic content, improve their content search experience, avoid duplicate content creation, provide context-based search, improve analytics, improve content management workflows, automate content classification, and automate content upload. This session describes the open source strategy, its applicability to content management, challenges, recommended solutions, and outcomes.

Keywords: content classification, content management, knowledge management, open source

Procedia PDF Downloads 210