Search results for: implement complex
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6546

5046 Smartphone-Based Human Activity Recognition by Machine Learning Methods

Authors: Yanting Cao, Kazumitsu Nawata

Abstract:

As smartphones are upgraded, their software and hardware become smarter, and smartphone-based human activity recognition can accordingly become more refined, complex, and detailed. In this context, we analyzed a set of experimental data obtained by observing and measuring 30 volunteers performing six activities of daily living (ADL). Because of the large sample size, and especially the 561-feature vector of time- and frequency-domain variables, cleaning these intractable features and training a proper model is extremely challenging. After a series of feature-selection and parameter-adjustment steps, a well-performing SVM classifier was trained.
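The pipeline the abstract describes, feature selection followed by SVM training, can be sketched as follows. This is a minimal illustration on synthetic stand-in data, not the actual 561-feature HAR dataset; the selector size, kernel, and C value are our own assumptions, not the paper's tuned settings.

```python
# Sketch: feature selection followed by SVM training on synthetic
# stand-ins for the 561-feature, six-class activity data.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 561))      # 300 samples, 561 features
y = rng.integers(0, 6, size=300)     # six activity classes (ADL)
X[:, :20] += y[:, None] * 0.5        # make 20 features informative

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Reduce the intractable feature vector, scale, then fit an RBF-kernel SVM.
clf = Pipeline([
    ("select", SelectKBest(f_classif, k=50)),
    ("scale", StandardScaler()),
    ("svm", SVC(kernel="rbf", C=10.0, gamma="scale")),
])
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

On real HAR data the selector score function and k, as well as C and gamma, would be chosen by cross-validation, which is presumably the "parameters adjustment" the abstract refers to.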

Keywords: smart sensors, human activity recognition, artificial intelligence, SVM

Procedia PDF Downloads 139
5045 Cellular Automata Using Fractional Integral Model

Authors: Yasser F. Hassan

Abstract:

In this paper, a proposed model of cellular automata is studied by means of a fractional integral function. A cellular automaton is a decentralized computing model providing an excellent platform for performing complex computation with the help of only local information. The paper discusses how a fractional integral function can be used to represent cellular automata memory or state. The architecture of the computing and learning model is given, and the results of calibrating the approach are also presented.
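One way to picture a cellular automaton with fractional-integral memory is the following toy construction. This is our own illustration, not the paper's model: each cell's effective state is a power-law weighted sum of its past states (a discrete analogue of a fractional integral), thresholded before the local update rule is applied.

```python
# Toy sketch (our own construction): an elementary CA whose cells see a
# power-law weighted memory of past states, a discrete fractional-integral
# analogue, rather than only the most recent state.
import numpy as np

def fractional_memory(history, alpha=0.5):
    """Weight past states with power-law coefficients w_k ~ (k+1)^(alpha-1)."""
    weights = np.array([(k + 1) ** (alpha - 1) for k in range(len(history))])
    weights /= weights.sum()
    # history[-1] is the most recent state; reverse so w_0 applies to it.
    return np.tensordot(weights, np.array(history[::-1]), axes=1)

def step(state, rule=110):
    """One synchronous update of an elementary CA with periodic boundaries."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    idx = 4 * left + 2 * state + right
    table = np.array([(rule >> i) & 1 for i in range(8)])
    return table[idx]

state = np.zeros(64, dtype=int)
state[32] = 1
history = [state]
for _ in range(20):
    # Threshold the fractional memory to get the effective binary state.
    effective = (fractional_memory(history) >= 0.5).astype(int)
    history.append(step(effective))
final = history[-1]
```

Setting alpha = 1 recovers uniform averaging over the history, while alpha close to 0 concentrates the weight on recent states, so the exponent controls how long the automaton "remembers".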

Keywords: fractional integral, cellular automata, memory, learning

Procedia PDF Downloads 406
5044 Barriers Facing the Implementation of Lean Manufacturing in Libyan Manufacturing Companies

Authors: Mohamed Abduelmula, Martin Birkett, Chris Connor

Abstract:

Lean manufacturing has developed from being a set of tools and methods into a management philosophy that can be used to remove or reduce waste in manufacturing processes and so enhance the operational productivity of an enterprise. Several enterprises around the world have applied the lean manufacturing system and gained great improvements. This paper investigates the barriers and obstacles that Libyan manufacturing companies face in implementing lean manufacturing. A mixed-method approach is suggested, starting with a questionnaire to collect quantitative data, which is then used to develop semi-structured interviews for collecting qualitative data. The findings of the questionnaire and how these can be used to further develop the semi-structured interviews are then discussed. The survey was distributed to 65 manufacturing companies in Libya, and a response rate of 64.6% was obtained. The results showed that there are five main barriers to implementing lean in Libya, namely organizational culture; skills, expertise, and training; financial capability; top management; and communication. These barriers were also identified in the literature as significant obstacles to implementing lean in other countries' industries. Understanding the difficulties that face the implementation of lean manufacturing systems, as a new and modern system, and using this understanding to develop a suitable framework will help to improve the manufacturing sector in Libya.

Keywords: lean manufacturing, barriers, questionnaire, Libyan manufacturing companies

Procedia PDF Downloads 238
5043 A Method for Harvesting Atmospheric Lightning-Energy and Utilization of Extra Generated Power of Nuclear Power Plants during the Low Energy Demand Periods

Authors: Akbar Rahmani Nejad, Pejman Rahmani Nejad, Ahmad Rahmani Nejad

Abstract:

We propose arresting atmospheric lightning and passing the electrical current of lightning bolts through underground water tanks to produce hydrogen, storing the hydrogen in reservoirs to be used later as clean and sustainable energy. It is proposed to implement this method for the storage of extra electrical power (instead of lightning energy) during low-energy-demand periods: hydrogen is produced as a clean energy source, stored in big reservoirs, and later burned at an appropriate time to generate electricity. This method avoids the complicated process of changing the output power of nuclear power plants. It is also possible to pass an electric current through a sodium chloride solution to produce chlorine and sodium, or through human waste to produce methane, etc. Atmospheric lightning is an accidental phenomenon; however, using this free energy simply by connecting the output of lightning arresters to the output of the power plant during low-demand periods, which requires no significant change in the design of the power plant and incurs no cost, can be considered an entirely economical design.

Keywords: hydrogen gas, lightning energy, power plant, resistive element

Procedia PDF Downloads 135
5042 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood

Authors: Randa Alharbi, Vladislav Vyshemirsky

Abstract:

Systems biology is an important field in science which focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their function, and their interactions. A well-designed model requires selecting a suitable mechanism which can capture the main features of the system, defining the essential components of the system, and representing an appropriate law that can define the interactions between its components. Complex biological systems exhibit stochastic behaviour; thus, probabilistic models are suitable for describing and analysing biological systems. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model, describing the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time can be obtained from the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet, inference in such a complex system is challenging, as it requires the evaluation of the likelihood, which is intractable in most cases. There are different statistical methods that allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation is a common approach for tackling such inference, relying on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper, we discuss the efficiency and possible practical issues of each method, taking into account their computational time. We demonstrate likelihood-free inference by analysing a model of the Repressilator using both methods. A detailed investigation is performed to quantify the difference between these methods in terms of efficiency and computational cost.
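The mechanics of the first approach, approximate Bayesian computation by rejection, can be sketched in a few lines. For brevity we use a simple birth process simulated with the Gillespie algorithm instead of the Repressilator; the birth-rate parameter, prior, and tolerance below are illustrative assumptions, but the logic is the same: simulate from the model, compare a summary statistic to the data, and keep only parameters whose simulations land close enough.

```python
# Sketch of ABC rejection for a CTMC with intractable likelihood,
# using a simple birth process (birth at rate theta*X) as a stand-in.
import random

def simulate_birth_process(theta, t_end=5.0, x0=1):
    """Gillespie simulation: exponential waiting times, unit birth jumps."""
    t, x = 0.0, x0
    while True:
        t += random.expovariate(theta * x)
        if t > t_end:
            return x
        x += 1

random.seed(1)
theta_true = 0.4
observed = simulate_birth_process(theta_true)   # pseudo-data

accepted = []
while len(accepted) < 200:
    theta = random.uniform(0.0, 1.0)            # draw from the prior
    if theta <= 0:
        continue
    sim = simulate_birth_process(theta)
    if abs(sim - observed) <= 2:                # tolerance on the summary stat
        accepted.append(theta)

posterior_mean = sum(accepted) / len(accepted)
```

The accepted draws approximate the posterior; shrinking the tolerance sharpens the approximation at the cost of more rejected simulations, which is exactly the efficiency trade-off the abstract compares against PMCMC.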

Keywords: approximate Bayesian computation (ABC), continuous-time Markov chains, sequential Monte Carlo, particle Markov chain Monte Carlo (PMCMC)

Procedia PDF Downloads 198
5041 Interval Type-2 Fuzzy Vibration Control of an ERF Embedded Smart Structure

Authors: Chih-Jer Lin, Chun-Ying Lee, Ying Liu, Chiang-Ho Cheng

Abstract:

The main objective of this article is to present semi-active vibration control of a cantilever beam using an electro-rheological fluid (ERF) embedded sandwich structure. ER fluid is a smart material in which an applied electric field causes the suspended particles to polarize and connect with each other to form chains. Because the stiffness and damping coefficients of the ER fluid can be changed within 10 microseconds, ERF is suitable as the material embedded in a tunable vibration absorber to create a smart absorber. For the ERF smart-material embedded structure, the fuzzy control law depends on an experimental expert database and the proposed self-tuning strategy. The electric field is controlled by a CRIO embedded system to implement the real application. This study investigates the performance of Type-1 fuzzy and interval Type-2 fuzzy controllers. Interval Type-2 fuzzy control is used to handle the modeling uncertainties of this ERF embedded shock absorber. Self-tuning vibration controllers using Type-1 and interval Type-2 fuzzy laws were implemented on the shock absorber system. Based on the resulting performance, interval Type-2 fuzzy control is better than traditional Type-1 fuzzy control for this vibration control system.
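For readers unfamiliar with the baseline, a Type-1 fuzzy controller of the kind being compared can be sketched as triangular memberships plus centroid defuzzification. This toy is our own illustration, not the paper's controller: the membership breakpoints, rule outputs, and the mapping from vibration error to field command are invented. An interval Type-2 controller would replace each single membership value with a [lower, upper] interval to absorb modeling uncertainty.

```python
# Toy Type-1 fuzzy sketch: triangular memberships over a normalized
# vibration error, weighted-average (centroid) defuzzification to a
# normalized electric-field command.
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_field(error):
    """Map normalized vibration error in [-1, 1] to a field command in [0, 1]."""
    # Rule base: negative error -> low field, near-zero -> medium, positive -> high.
    rules = [
        (tri(error, -1.5, -1.0, 0.0), 0.1),   # negative error
        (tri(error, -1.0,  0.0, 1.0), 0.5),   # near-zero error
        (tri(error,  0.0,  1.0, 1.5), 0.9),   # positive error
    ]
    num = sum(mu * out for mu, out in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.5
```

The self-tuning strategy in the article would then adapt such memberships online from the experimental expert database.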

Keywords: electro-rheological fluid, semi-active vibration control, shock absorber, type 2 fuzzy control

Procedia PDF Downloads 441
5040 The Beam Expansion Method, A Simplified and Efficient Approach of Field Propagation and Resonators Modes Study

Authors: Zaia Derrar Kaddour

Abstract:

The study of a beam along an optical path is generally achieved by means of the diffraction integral. Unfortunately, in some problems this tool turns out to be unfriendly and hard to implement. Instead, the beam expansion method for computing field profiles appears to be an interesting alternative. The beam expansion method consists of expanding the field pattern as a series in a set of orthogonal functions. Propagating each individual component through a circuit and adding up the derived elements leads easily to the result; the problem is then reduced to finding how the expansion coefficients change in a circuit. The beam expansion method requires a systematic study of each type of optical element that can be met in the considered optical path. In this work, we analyze the following fundamental elements: first-order optical systems, hard apertures, and waveguides. We show that the former element type is completely defined thanks to the Gouy phase shift expression we provide, while the latter two require a suitable mode conversion. To endorse the usefulness and relevance of the beam expansion approach, we show some of its applications, such as the treatment of the thermal lens effect and the study of unstable resonators.
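The core expansion step can be illustrated numerically: project a one-dimensional field onto orthonormal Hermite-Gaussian modes and check that summing the weighted components reconstructs it. The test field (a displaced Gaussian) and the truncation order are our own choices; free-space propagation would then amount to attaching a Gouy phase factor exp(-i(n + 1/2)ψ) to each coefficient.

```python
# Sketch: expand a field in orthonormal Hermite-Gaussian modes and
# verify that the series reconstructs it.
import numpy as np
from math import factorial, pi, sqrt

x = np.linspace(-8, 8, 2001)
dx = x[1] - x[0]

def hg_mode(n, x):
    """Orthonormal Hermite-Gaussian mode phi_n(x) = H_n(x) e^{-x^2/2} / norm."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    Hn = np.polynomial.hermite.hermval(x, coeffs)   # physicists' H_n
    norm = 1.0 / sqrt((2.0 ** n) * factorial(n) * sqrt(pi))
    return norm * Hn * np.exp(-x ** 2 / 2)

# Field to expand: a normalized Gaussian displaced from the axis.
field = np.exp(-(x - 0.7) ** 2 / 2) / pi ** 0.25

N = 20  # truncation order
c = np.array([(field * hg_mode(n, x)).sum() * dx for n in range(N)])
reconstruction = sum(c[n] * hg_mode(n, x) for n in range(N))
err = float(np.max(np.abs(field - reconstruction)))
```

For this smooth field the coefficients decay rapidly, so a modest truncation already reconstructs the profile to high accuracy, which is what makes the method attractive compared with evaluating the diffraction integral directly.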

Keywords: Gouy phase shift, modes, optical resonators, unstable resonators

Procedia PDF Downloads 55
5039 Economic Perspectives for Agriculture and Forestry Owners in Bulgaria

Authors: Todor Nickolov Stoyanov

Abstract:

These factors appear as a reason for difficulties in financing from the European Union's rural development programmes. Credit conditions of commercial banks are difficult to meet, and their interest rates are too high. One of the possibilities for short-term loans on preferential conditions for small and medium-sized agricultural and forest owners is the credit cooperative. After the changes that occurred in the country after 1990, the need to restore credit cooperatives arose. The purpose of creating credit cooperatives is to assist private agricultural and forest owners, to help expand and strengthen their farms, to increase their quality of life, and to improve the local economy. It was found that: in Bulgaria there is a legal obstacle preventing credit cooperatives from expanding their business in the deposit and lending sphere; private forest and agricultural owners need small loans to solve small problems in a given season; providing such loans is not attractive for banks, but it is extremely necessary for owners of small forests and lands; and if a special law on credit cooperatives is adopted, as required by the Cooperatives Act, it will allow more local people to be members of such credit structures and receive the necessary loans. In conclusion, proposals to create conditions for the development of credit cooperatives in the country are made, and the positive results expected from their creation are summarized.

Keywords: cooperatives, credit cooperatives, forestry, forest owners

Procedia PDF Downloads 217
5038 Toward a Coalitional Subject in Contemporary American Feminist Literature

Authors: Su-Lin Yu

Abstract:

Coalition politics has been one of feminists' persistent concerns. Following recent feminist discussion of new modes of affiliation across difference, she will explore how the process of female subject formation depends on alliances across different cultural locations. First, she will examine how coalition politics is reformulated across difference in contemporary feminist literature. In particular, the paper will identify the particular contexts and locations in which coalition building both enables and constrains the female subject. She will attempt to explore how contemporary feminist literature highlights the possibilities and limitations of solidarity and affiliation. To understand coalition politics in contemporary feminist works, she will engage in close readings of two texts: Rebecca Walker's Black, White and Jewish: Memoir of a Shifting Self and Danzy Senna's Caucasia. Both Walker and Senna have articulated the complex nodes of identity that are staged by a politics of location, as they refuse to be boxed into simplistic essentialist positions. Their texts are characterized by the characters' racial ambiguity and their social and geographical mobility in the contemporary United States. Their experiences of living through conflictual and contradictory relationships never fully fit the boundaries of racial categorization. Each of these texts demonstrates the limits as well as the possibilities of working with diversity among and within persons and groups, thus laying the ground for complex alliance formation. Because each of the protagonists must negotiate a set of contradictions, they have to constantly shift their affiliations. Rather than construct a static alliance, they describe a process of moving 'beyond boundaries,' an embracing of multiple locations. As self-identified third wavers, Rebecca Walker and Danzy Senna have been marked as 'leaders' by the feminist establishment and by the mainstream U.S. media. Their texts have captured both mass popularity and critical attention in the feminist and, often, the non-feminist literary community. By analyzing these texts, she will show how contemporary American feminist literature reveals a coalition politics fraught with complications and unintended consequences. Taken as a whole, these works provide not only an important examination of the coalition politics of American feminism but also a snapshot of a central debate within feminist critiques of coalition politics.

Keywords: coalition politics, contemporary women’s literature, identity, female subject

Procedia PDF Downloads 285
5037 Measuring Delay Using Software Defined Networks: Limitations, Challenges, and Suggestions for Openflow

Authors: Ahmed Alutaibi, Ganti Sudhakar

Abstract:

Providing better quality of service (QoS) to end users has been a challenging problem for researchers and service providers. Building applications on best-effort network protocols hindered the adoption of guaranteed service parameters and, ultimately, of quality of service. The introduction of software-defined networking (SDN) opened the door for a paradigm shift towards more controlled, programmable, configurable behavior. OpenFlow has been, and still is, the main implementation of the SDN vision. To facilitate better QoS for applications, the network must calculate and measure certain parameters; one of those parameters is the delay between the two ends of a connection. Using the power of SDN and knowledge of application and network behavior, SDN networks can adjust to different conditions and specifications. In this paper, we use the capabilities of SDN to implement multiple algorithms that measure delay end to end, not only inside the SDN network. The results of applying the algorithms in an emulated environment show that we can obtain measurements close to the emulated delay. The results also show that the load on the network and controller differs depending on the algorithm, and that the transport-layer handshake algorithm performs best among the tested algorithms. From these results and the implementation, we identify the limitations of OpenFlow and develop suggestions to address them.
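The transport-layer handshake idea can be sketched abstractly: a controller that observes a TCP SYN at one edge and the matching SYN-ACK at another can estimate path delay from its own packet-in timestamps. The function and event format below are our own illustration, not the OpenFlow API or the paper's implementation.

```python
# Sketch: estimate per-flow delay from controller-observed handshake
# timestamps, taking half the SYN -> SYN-ACK gap as a one-way estimate.
def handshake_delay(events):
    """events: list of (timestamp, flow_id, kind), kind in {'SYN', 'SYNACK'}.
    Returns {flow_id: estimated one-way delay in seconds}."""
    syn_seen = {}
    delays = {}
    for ts, flow, kind in sorted(events):
        if kind == "SYN":
            syn_seen[flow] = ts
        elif kind == "SYNACK" and flow in syn_seen:
            delays[flow] = (ts - syn_seen.pop(flow)) / 2.0
    return delays
```

In a real deployment the controller's processing latency would have to be subtracted from each timestamp, which is one of the practical limitations this kind of measurement runs into.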

Keywords: software defined networking, quality of service, delay measurement, openflow, mininet

Procedia PDF Downloads 159
5036 Site Specific Nutrient Management Need in India Now

Authors: A. H. Nanher, N. P. Singh, Shashidhar Yadav, Sachin Tyagi

Abstract:

Agricultural production is the outcome of a complex interaction of seed, soil, water, and agro-chemicals (including fertilizers). Therefore, judicious management of all these inputs is essential for the sustainability of such a complex system. Precision agriculture gives farmers the ability to use crop inputs more effectively, including fertilizers, pesticides, tillage, and irrigation water. More effective use of inputs means greater crop yield and/or quality without polluting the environment. The focus on enhancing productivity during the Green Revolution, coupled with a total disregard for proper management of inputs and for ecological impacts, has resulted in environmental degradation. This study evaluates a new approach for site-specific nutrient management (SSNM). Large variation in initial soil fertility characteristics and in the indigenous supply of N, P, and K was observed among fields. Field- and season-specific NPK applications were calculated by accounting for the indigenous nutrient supply, yield targets, and nutrient demand as a function of the interactions between N, P, and K. Nitrogen applications were fine-tuned based on season-specific rules and field-specific monitoring of crop N status. The performance of SSNM did not differ significantly between high-yielding and low-yielding climatic seasons, but improved over time, with larger benefits observed in the second year. Future strategies for nutrient management in intensive rice systems must become more site-specific and dynamic, managing spatially and temporally variable resources based on a quantitative understanding of the congruence between nutrient supply and crop demand. The SSNM concept has demonstrated promising agronomic and economic potential. It can be used for managing plant nutrients at any scale, ranging from a general recommendation for homogeneous management of a larger domain to true management of between-field variability. Assessment of pest profiles in FFP and SSNM plots suggests that SSNM may also reduce pest incidence, particularly diseases that are often associated with excessive N use or unbalanced plant nutrition.

Keywords: nutrient, pesticide, crop, yield

Procedia PDF Downloads 420
5035 The Flypaper Effect and the Municipal Participation Fund in the Brazilian Public Sector

Authors: Lucas Oliveira Gomes Ferreira, André Luiz Marques Serrano

Abstract:

The fiscal decentralization driven by the 1988 Constitution was responsible for granting greater autonomy to Brazilian subnational entities, as states and municipalities were entrusted with greater responsibilities to provide local public goods and services. However, the revenues necessary to implement these new attributions are largely received through intergovernmental transfers rather than local tax collection. The literature points out that public spending increases more in response to unconditional, non-matching (lump-sum) intergovernmental grants than to an increase in taxpayers' income. This effect, called the flypaper effect, happens because the funds received could be used to reduce local taxes, which would mean an increase in citizens' private income; instead, they are applied in the public sector in the form of expenditure. The present work investigates the existence of the flypaper effect in Brazilian municipalities during the first two decades of the 21st century. The research uses the Municipal Participation Fund (FPM) as a grant proxy from 2000 to 2019, through cross-section and panel data econometrics for all 5,568 municipalities. The results indicate the presence of the flypaper effect in Brazilian municipalities, as well as a proportional relationship between the receipt of constitutional transfers and the increase in public expenditure.
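The basic econometric test can be sketched on synthetic data: regress municipal spending on private income and lump-sum transfers, and call the flypaper effect present when the transfer coefficient exceeds the income coefficient. The data-generating values below are invented for illustration; the study itself uses the actual FPM panel, not this toy cross-section.

```python
# Illustrative flypaper-effect regression on synthetic municipal data.
import numpy as np

rng = np.random.default_rng(42)
n = 5568                                  # one cross-section of municipalities
income = rng.uniform(10, 50, n)           # private income per capita
transfers = rng.uniform(1, 10, n)         # lump-sum (FPM-like) transfers
# Generate spending with a built-in flypaper effect (0.9 vs 0.1).
spending = 2.0 + 0.1 * income + 0.9 * transfers + rng.normal(0, 0.5, n)

# OLS via least squares: spending ~ const + income + transfers.
X = np.column_stack([np.ones(n), income, transfers])
beta, *_ = np.linalg.lstsq(X, spending, rcond=None)
b_income, b_transfers = beta[1], beta[2]
flypaper = b_transfers > b_income          # the flypaper hypothesis
```

A panel version would add municipality and year fixed effects, but the comparison of the two marginal propensities to spend is the same.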

Keywords: flypaper effect, intergovernmental transfers, municipal participation fund, fiscal federalism

Procedia PDF Downloads 139
5034 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge that calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of k-mers increases. The best performing classification model had a k-mer size of 10 (the longest tested) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay amongst accuracy, computing resources, and explainability of classification results. The analysis also provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
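The k-mer representation step itself is straightforward to sketch: each genome string becomes a vector of overlapping k-mer counts over a shared vocabulary, and those vectors form the feature space the classifiers operate on. The toy sequences below are ours, not MTB genomes.

```python
# Sketch: turn DNA strings into fixed-length k-mer count vectors.
from collections import Counter

def kmer_counts(sequence, k):
    """Counts of every overlapping k-mer in a DNA string."""
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

def kmer_vector(sequence, k, vocabulary):
    """Fixed-order count vector over a shared k-mer vocabulary."""
    counts = kmer_counts(sequence, k)
    return [counts.get(kmer, 0) for kmer in vocabulary]

seqs = ["ATGCGATGCA", "ATATATATAT"]
k = 3
vocab = sorted(set().union(*(kmer_counts(s, k) for s in seqs)))
vectors = [kmer_vector(s, k, vocab) for s in seqs]
```

Note the trade-off the abstract points to: the vocabulary can grow up to 4^k entries, so larger, more biologically informative k-mers come at a steep cost in memory and compute.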

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 161
5033 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge that calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of k-mers increases. The best performing classification model had a k-mer size of 10 (the longest tested) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay amongst accuracy, computing resources, and explainability of classification results. The analysis also provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which is important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 148
5032 Analysing Waste Management Options in the Printing Industry: Case of a South African Company

Authors: Stanley Fore

Abstract:

The case study company is one of the leading newsprint companies in South Africa. It has achieved this status through operational expansion, diversification, and investment in cutting-edge technology, and has a reputation for the highest quality and personalised service that transcends borders and industries. The company offers a wide variety of small- and large-scale printing services. It faces the challenge of significant waste production during normal operations, generating 1,200 kg of plastic waste and 60-70 tonnes of paper waste per month. The company currently operates a waste management process whereby waste paper is sold, at low cost, to recycling firms for further processing. Considering the quantity of waste being generated, the company has embarked on a venture to find a more profitable solution to its current waste production. As waste management and recycling are not the company's core business, the aim of the venture is to implement a secondary, profitable waste-processing business, expedited as a strategic project. This research aims to estimate the financial feasibility of a selected solution as well as the impact of the non-financial considerations thereof. The financial feasibility is analysed using metrics such as payback period, internal rate of return, and net present value.
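The three feasibility metrics named in the abstract can be sketched as follows. The cash flows are purely hypothetical, since the project's actual figures are not given; the point is only how payback period, NPV, and IRR are computed.

```python
# Sketch of the three financial-feasibility metrics on illustrative cash flows.
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the upfront (negative) outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0):
    """Internal rate of return: the rate at which NPV crosses zero,
    found by bisection (assumes NPV is decreasing in the rate)."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cash_flows):
    """First period at which cumulative cash flow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None

flows = [-100_000, 30_000, 40_000, 50_000, 60_000]  # hypothetical project
```

A project is typically judged attractive when NPV at the firm's discount rate is positive and the IRR exceeds the cost of capital, with payback period as a secondary liquidity check.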

Keywords: waste, printing industry, up-cycling, management

Procedia PDF Downloads 259
5031 A Study on the Implementation of Differentiating Instruction Based on Universal Design for Learning

Authors: Yong Wook Kim

Abstract:

The diversity of students in regular classrooms in South Korea is increasing owing to the expansion of inclusive education and the growing number of multicultural students. In this diverse classroom environment, universal design for learning (UDL) has been proposed as a way to meet both the educational needs and the social expectation of student achievement. UDL offers a variety of practical teaching methods, one of which is differentiating instruction. Differentiating instruction has been criticized for resource limitations, organizational resistance, and the lack of an easy-to-implement framework; however, within the framework provided by UDL, it can be implemented flexibly. In practice, UDL and differentiating instruction are complementary, but there is still a lack of research suggesting specific implementation methods that apply both concepts at the same time. This study investigates the effects of differentiating instruction strategies according to learner characteristics (readiness, interest, learning profile), the components of differentiating instruction (content, process, performance, learning environment), and especially the UDL principles (representation, action and expression, engagement) present in differentiating instruction, as well as the implementation of UDL-based differentiating instruction through Planning for All Learners (PAL) and the UDL lesson plan cycle. Such a series of studies is meaningful in that it can enhance the possibility of more concrete and realistic UDL-based teaching and learning strategies in the classroom, especially in inclusive settings.

Keywords: universal design for learning, differentiating instruction, UDL lesson plan, PAL

Procedia PDF Downloads 189
5030 Mesoporous Na2Ti3O7 Nanotube-Constructed Materials with Hierarchical Architecture: Synthesis and Properties

Authors: Neumoin Anton Ivanovich, Opra Denis Pavlovich

Abstract:

Materials based on titanium oxide compounds are widely used in areas such as solar energy, photocatalysis, the food industry and hygiene products, biomedical technologies, etc. Demand has also formed in the battery industry (an example is the commercialization of Li4Ti5O12), where much attention has recently been paid to next-generation systems and technologies such as sodium-ion batteries. This dictates the need to search for new materials with improved characteristics, as well as for scalable ways to obtain them. One way to solve these problems is the creation of nanomaterials, which often have physicochemical properties that differ radically from those of their micro- or macroscopic counterparts. At the same time, it is important to control the texture (specific surface area, porosity) of such materials. In view of the above, among other methods, the hydrothermal technique seems suitable, as it allows a wide range of control over the synthesis conditions. In the present study, a method was developed for the preparation of mesoporous nanostructured sodium trititanate (Na2Ti3O7) with a hierarchical architecture. The materials were synthesized by hydrothermal processing and exhibit a complex, hierarchically organized two-level architecture: at the first level of the hierarchy, the materials consist of particles with a rough surface, and at the second level, of one-dimensional nanotubes. The products were found to have a high specific surface area and porosity, with a narrow pore-size distribution (about 6 nm). As is known, specific surface area and porosity are important characteristics of functional materials, which largely determine the possibilities and directions of their practical application. Electrochemical impedance spectroscopy data show that the resulting sodium trititanate has sufficiently high electrical conductivity. The synthesized, complexly organized porous nanoarchitecture based on sodium trititanate is thus expected to be in practical demand, for example, in next-generation electrochemical energy storage and conversion devices.

Keywords: sodium trititanate, hierarchical materials, mesoporosity, nanotubes, hydrothermal synthesis

Procedia PDF Downloads 104
5029 Robust Control of a Parallel 3-RRR Robotic Manipulator via μ-Synthesis Method

Authors: A. Abbasi Moshaii, M. Soltan Rezaee, M. Mohammadi Moghaddam

Abstract:

Control of some mechanisms is difficult because of their complex dynamic equations. If part of the complexity results from uncertainties, robust control is an efficient way to deal with it. In this way, the control procedure can be simple and fast, and ultimately a simple controller can be designed. One such mechanism is the 3-RRR, a parallel mechanism with three revolute joints. This paper aims to robustly control a 3-RRR planar mechanism and shows that the approach could be applied to other mechanisms, so a significant problem in mechanism control can be solved. The relevant diagrams are drawn, and they show the correctness of the control process.

Keywords: 3-RRR, dynamic equations, mechanisms control, structural uncertainty

Procedia PDF Downloads 548
5028 Spatio-Temporal Analysis of Land Use Change and Green Cover Index

Authors: Poonam Sharma, Ankur Srivastav

Abstract:

Cities are complex and dynamic systems that constitute a significant challenge to urban planning. The increasing size of the built-up area, owing to growing population pressure and economic growth, has led to massive land use/land cover change, resulting in the loss of natural habitat and thus reducing green cover in urban areas. Urban environmental quality is influenced by several aspects, including the city's geographical configuration, the scale and nature of human activities occurring, and the environmental impacts generated. Cities and their sustainability are often discussed together, as cities stand confronted with numerous environmental concerns while the world becomes increasingly urbanized and cities are situated in the mesh of global networks in multiple senses. A rapidly transforming urban setting plays a crucial role in changing the green area of natural habitats. This paper examines the pattern of urban growth and measures land use/land cover change in Gurgaon, Haryana, India through the integration of geospatial techniques. Satellite images are used to measure the spatio-temporal changes that have occurred in land use and land cover, resulting in a new cityscape. The analysis shows that drastic changes in land use have occurred, with a massive rise in built-up areas and a decrease in green cover, making the sustainability of the city an important area of concern. The massive increase in built-up area has influenced localised temperatures and heat concentration. To enhance the decision-making process in urban planning, a detailed and real-world depiction of these urban spaces is the need of the hour. Monitoring indicators of key processes in land use and economic development is essential for evaluating policy measures.

Keywords: cityscape, geospatial techniques, green cover index, urban environmental quality, urban planning

Procedia PDF Downloads 272
5027 Cakrawala Baca Transformation Model into Social Enterprise: A Benchmark Approach from Socentra Agro Mandiri (SAM) and Agritektur

Authors: Syafinatul Fitri

Abstract:

Cakrawala Baca is a social organization in Indonesia that has resolved to transform itself into a social enterprise in order to create a more sustainable organization delivering a more sustainable social impact. Cakrawala Baca operates on a voluntary system and has a passive social target. It funds its programs through several fundraising activities that depend on donors or sponsors; therefore, the social activities it holds do not create sustainable social impact. This differs from a social enterprise, which is usually more independent, funding its activities through social business, pursuing an active social target, and offering professional work for organization members; a social enterprise can therefore sustain itself and create sustainable social impact. Developing a transformation model from social movement into social enterprise is the focus of this study. To achieve the aim of the study, a benchmark approach based on successful social enterprises in Indonesia that were previously formed as social movements is employed. The benchmark is conducted through internal and external scanning, which results in an understanding of how they transformed into social enterprises. After understanding the transformations of SAM and Agritektur, a transformation pattern is formulated based on their similarities. This transformation pattern is then used to formulate the transformation plan for Cakrawala Baca to become a social enterprise.

Keywords: social movement/social organization, non-profit organization (NPO), social enterprise, transformation, benchmark approach

Procedia PDF Downloads 500
5026 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. Modeling option pricing with Black-Scholes models with jumps takes market movement into account. However, only numerical methods can solve such models, and not all numerical methods are efficient for them because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction form of Padé schemes is used to overcome the complexity of inverting polynomials of matrices. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel and easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
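The FFT-based matrix-vector product the abstract relies on can be illustrated on its simplest case, a circulant matrix, where the product reduces to elementwise multiplication in Fourier space. This is an illustrative sketch only: the discretized operators arising from Merton's and Kou's jump integrals are Toeplitz-like rather than exactly circulant, and the function names below are not taken from the paper.

```python
import numpy as np

def circulant_matvec(c, x):
    # Product of a circulant matrix (defined by its first column c) with x,
    # computed via FFT in O(M log M) instead of the O(M^2) dense product:
    # the eigenvalues of a circulant matrix are fft(c).
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

def circulant_dense(c):
    # Dense reference matrix for comparison: C[i, j] = c[(i - j) mod M].
    M = len(c)
    return np.array([[c[(i - j) % M] for j in range(M)] for i in range(M)])
```

For a genuine Toeplitz matrix, the same idea applies after embedding it in a circulant matrix of roughly twice the size, which preserves the O(M log M) cost.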

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 143
5025 Defining Methodology for Multi Model Software Process Improvement Framework

Authors: Aedah Abd Rahman

Abstract:

Software organisations may implement single or multiple frameworks in order to remain competitive. There is a wide selection of generic Software Process Improvement (SPI) frameworks, best practices and standards, implemented with different focuses and goals. Issues and difficulties emerge in SPI practice in the context of software development and IT Service Management (ITSM). This research looks into the integration of multiple frameworks from the perspective of software development and ITSM. The research question of this study is how to define the steps of a methodology to solve the multi model software process improvement problem. The objective of this study is to define the research approach and methodologies to produce a more integrated and efficient Multi Model Process Improvement (MMPI) solution. A multi-step methodology is used which contains a case study, framework mapping and a Delphi study. The research outcome has proven the usefulness and appropriateness of the proposed framework in SPI and quality practice in the Malaysian software industry. This mixed method research approach is used to tackle problems from every angle in the context of software development and services. The methodology facilitates the implementation and management of a multi model environment of SPI frameworks in multiple domains.

Keywords: Delphi study, methodology, multi model software process improvement, service management

Procedia PDF Downloads 257
5024 Development of Building Information Modeling in Property Industry: Beginning with Building Information Modeling Construction

Authors: B. Godefroy, D. Beladjine, K. Beddiar

Abstract:

In France, construction BIM actors commonly evoke the gains of BIM for exploitation through integration over the life cycle of a building. Standardization at level 7 of development would achieve this stage of the digital model. The householders include local public authorities, social landlords, public institutions (health and education), enterprises, and facilities management companies. They have a dual role: owner and manager of their housing complex. In a context of financial constraint, BIM for exploitation aims to control costs, make long-term investment choices, renew the portfolio and enable environmental standards to be met. It assumes knowledge of the existing buildings, marked by their size and complexity. The information sought must be synthetic and structured, and it concerns, in general, a real estate complex. We conducted a study with professionals about their concerns and ways of using BIM, to see how householders could benefit from this development. To obtain results, keeping in mind the recurring questions from project management about operators' needs, we tested the following stages: 1) Instill a minimal BIM culture in the operator's multidisciplinary teams, then by business line, 2) Learn, through BIM tools, how to adapt their trade to operations, 3) Understand the place and creation of a graphic and technical database management system, and determine the components of its library according to their needs, 4) Identify the cross-functional interventions of its managers by business line (operations, technical, information system, purchasing and legal aspects), 5) Set an internal protocol and define the impact of BIM in their digital strategy.
In addition, continuity of management through the integration of construction models in the operation phase raises the question of interoperability: controlling the production of IFC files in the operator's proprietary format and the export and import processes, a solution rivaled by the traditional method of vectorizing paper plans. Companies that digitize housing complexes, and those in facilities management, produce an IFC file directly, according to their needs and without recourse to the construction model; they produce business models for exploitation. They standardize components and equipment that are useful for coding. We observed the consequences of using BIM in the property industry and made the following observations: a) The value of data prevails over the graphics; 3D is little used b) The owner must, through his organization, promote the feedback of technical management information during the design phase c) The operator's reflection on outsourcing concerns the acquisition of its information system and its services, weighing the risks and costs of internal or external development. This study allows us to highlight: i) The need for an internal organization of operators prior to responding to construction management ii) The evolution towards automated methods for creating models dedicated to exploitation, for which a specialization would be required iii) A review of project management communication: since management continuity is not articulated around the building model alone, it must take into account the operator's environment and reflect on its scope of action.

Keywords: information system, interoperability, models for exploitation, property industry

Procedia PDF Downloads 138
5023 Readiness of Thai Restaurant in Bangkok in Applying for Certification of Halal Food Services Standard for Tourism

Authors: Pongsiri Kingkan

Abstract:

This research aims to study the readiness of Thai restaurants in Bangkok to apply for certification under the Halal Food Services Standard for Tourism. The research was conducted using a mixed methodology; both quantitative and qualitative data were used. 420 questionnaires were used as tools to collect data from the samples, the restaurant employees. The results were divided into two parts: the demographic data and the readiness of Thai restaurants in Bangkok to apply for certification. The majority of the samples are single females aged between 18-30 years old, who earn about 282.40 US dollars a month. The readiness study demonstrated that readiness in foods and restaurant operating processes was scored at the lowest level. Readiness in social responsibility, food contact persons and food materials was rated at a low level. The readiness of utensils and kitchen tools, waste management, environmental management, and the availability of space for establishing halal food preparation was scored at an average level. Location readiness, food service safety and the relationship with the local community were rated at a high level. Interestingly, none of them was rated at the highest level.

Keywords: availability, Bangkok, halal, Thai restaurant, readiness

Procedia PDF Downloads 311
5022 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic

Authors: Diogen Babuc

Abstract:

The motivation for this work is the question of how many devote themselves to discovery in the world of science, where much has been discerned and revealed but much remains unknown. Methods: The insightful elements of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a+3. The obtained value is stored in a variable so as to remain constant during the run of the algorithm. Based on the given key, the string is divided into several groups of substrings, each of length k characters. The next step encodes each substring in the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters. However, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b+1, it returns to its initial value. The algorithm is executed, following the same procedure, until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character is used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it works better than the other methods from the point of view of execution time and storage space.
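The steps described above can be sketched in a few lines. This is a hypothetical reconstruction, not the authors' code: the function names, the lowercase a-z alphabet, and the use of 'x' as a padding character for the final substring are assumptions made for illustration.

```python
import random

def latin_djokovic_encrypt(text, k0=None, a=3):
    """Sketch of the described cipher: split into substrings of length k0,
    Caesar-shift each by a key that grows by 1 per substring and wraps."""
    b = a + 3
    if k0 is None:
        k0 = random.randint(a, b)      # key chosen at random in [a, b]
    text += "x" * ((-len(text)) % k0)  # assumed: pad last substring with 'x'
    k, out = k0, []
    for i in range(0, len(text), k0):  # substrings of length k0
        chunk = text[i:i + k0]
        # Caesar shift of the substring by the current key value k
        out.append("".join(chr((ord(c) - 97 + k) % 26 + 97) for c in chunk))
        k += 1                         # key increments for the next substring
        if k > b + 1:                  # and wraps back to its initial value
            k = k0
    return "".join(out)

def latin_djokovic_decrypt(cipher, k0, a=3):
    # Mirror of encryption with the shift applied in reverse.
    b = a + 3
    k, out = k0, []
    for i in range(0, len(cipher), k0):
        chunk = cipher[i:i + k0]
        out.append("".join(chr((ord(c) - 97 - k) % 26 + 97) for c in chunk))
        k += 1
        if k > b + 1:
            k = k0
    return "".join(out)
```

For example, with a = 3 and k0 = 4, "attackatdawn" encrypts to "exxehpfyjgct", and decrypting with the same key recovers the original string.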

Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison

Procedia PDF Downloads 98
5021 Roles of Tester in Automated World

Authors: Sagar Mahendrakar

Abstract:

Testers' roles have changed dramatically as automation continues to revolutionise the software development lifecycle. There's a general belief that manual testing is becoming outdated with the introduction of advanced testing frameworks and tools. This abstract, however, disproves that notion by examining the complex and dynamic role that testers play in automated environments. In this work, we explore the complex duties that testers have when everything is automated. We contend that although automation increases productivity and simplifies monotonous tasks, it cannot completely replace the cognitive abilities and subject-matter knowledge of human testers. Rather, testers shift their focus to higher-value tasks like creating test strategies, designing test cases, and delving into intricate scenarios that are difficult to automate. We also emphasise the critical role that testers play in guaranteeing the precision, thoroughness, and dependability of automated testing. Testers verify the efficacy of automated scripts and pinpoint areas for improvement through rigorous test planning, execution, and result analysis. They play the role of quality defenders, using their analytical and problem-solving abilities to find minute flaws that computerised tests might miss. Furthermore, the abstract emphasises how testing in automated environments is a collaborative process. In order to match testing efforts with business objectives, improve test automation frameworks, and rank testing tasks according to risk, testers work closely with developers, automation engineers, and other stakeholders. Finally, we discuss how testers in the era of automation need to possess a growing skill set. To stay current, testers need to develop skills in scripting languages, test automation tools, and emerging technologies in addition to traditional testing competencies. 
Soft skills like teamwork, communication, and flexibility are also essential for productive cooperation in cross-functional teams. This abstract clarifies the ongoing importance of testers in automated settings. Testers can use automation to improve software quality and provide outstanding user experiences by accepting their changing role as strategic partners and advocates for quality.

Keywords: testing, QA, automation, leadership

Procedia PDF Downloads 35
5020 Intersectionality and Sensemaking: Advancing the Conversation on Leadership as the Management of Meaning

Authors: Clifford Lewis

Abstract:

This paper aims to advance the conversation on an alternative view of leadership, namely 'leadership as the management of meaning'. Here, leadership is considered a social process of the management of meaning within an employment context, as opposed to a psychological trait, set of behaviours or relational consequence, as seen in mainstream leadership research. Specifically, this study explores the relationship between intersectional identities and the management of meaning. Design: Semi-structured, one-on-one interviews were conducted with women and men of colour working in South African private sector organisations in various leadership positions. Employing an intersectional approach using gender and race, participants were selected by using purposive and snowball sampling concurrently. Thematic and axial coding was used to identify dominant themes. Findings: Findings suggest that both gender and race shape how leaders manage meaning. Findings also confirm that intersectionality is an appropriate approach when studying the leadership experiences of groups who are underrepresented in organisational leadership structures. The findings point to the need for further research into the differential effects of intersecting identities on organisational leadership experiences, and show that 'leadership as the management of meaning' is an appropriate approach for addressing this knowledge gap. Theoretical Contribution: There is a large body of literature on the complex challenges faced by women and people of colour in leadership, but there is relatively little empirical work on how identity influences the management of meaning. This study contributes to the leadership literature by providing insight into how intersectional identities influence the management of meaning at work and how this impacts the leadership experiences of largely marginalised groups.
Practical Implications: Understanding the leadership experiences of underrepresented groups is important because of both legal mandates and for building diverse talent for organisations and societies. Such an understanding assists practitioners in being sensitive to simplistic notions of challenges individuals might face in accessing and practicing leadership in organisations. Advancing the conversation on leadership as the management of meaning allows for a better understanding of complex challenges faced by women and people of colour and an opportunity for organisations to systematically remove unfair structural obstacles and develop their diverse leadership capacity.

Keywords: intersectionality, diversity, leadership, sensemaking

Procedia PDF Downloads 266
5019 Application of Deep Neural Networks to Assess Corporate Credit Rating

Authors: Parisa Golbayani, Dan Wang, Ionuț Florescu

Abstract:

In this work we apply machine learning techniques to financial statement reports in order to assess a company's credit rating. Specifically, the work analyzes the performance of four neural network architectures (MLP, CNN, CNN2D, LSTM) in predicting corporate credit ratings as issued by Standard and Poor's. The paper focuses on companies from the energy, financial, and healthcare sectors in the US. The goal of this analysis is to improve the application of machine learning algorithms to credit assessment. To accomplish this, the study investigates three questions. First, we investigate whether the algorithms perform better when using a selected subset of important features or whether better performance is obtained by allowing the algorithms to select features themselves. Second, we address the temporal aspect inherent in financial data and study whether it is important for the results obtained by a machine learning algorithm. Third, we aim to answer whether one of the four particular neural network architectures considered consistently outperforms the others, and if so under which conditions. This work frames the problem as several case studies to answer these questions and analyzes the results using ANOVA and multiple comparison testing procedures.

Keywords: convolutional neural network, long short term memory, multilayer perceptron, credit rating

Procedia PDF Downloads 232
5018 Adverse Impacts of Poor Wastewater Management Practices on Water Quality in Gebeng Industrial Area, Pahang, Malaysia

Authors: I. M. Sujaul, M. A. Sobahan, A. A. Edriyana, F. M. Yahaya, R. M. Yunus

Abstract:

This study was carried out to investigate the adverse effects of industrial wastewater on surface water quality in the Gebeng industrial estate, Pahang, Malaysia. Surface water was collected from 6 sampling stations. Physico-chemical parameters were characterized based on in-situ and ex-situ analysis according to standard methods of the American Public Health Association (APHA). Selected heavy metals were determined using Inductively Coupled Plasma Mass Spectrometry (ICP-MS). The results revealed that the concentrations of heavy metals such as Pb, Cu, Cd, Cr and Hg were high in the samples, and that the values of Pb and Hg were higher in the wet season than in the dry season. According to the Malaysia National Water Quality Standard (NWQS) and the Water Quality Index (WQI), all the sampling stations were categorized as class IV (highly polluted). The present study revealed that the careless disposal of wastes and direct discharge of effluents adversely affected surface water quality. Therefore, the authorities should enforce the laws to ensure proper wastewater management practices for environmental sustainability around the study area.

Keywords: water, heavy metals, water quality index, Gebeng

Procedia PDF Downloads 372
5017 Deciding Graph Non-Hamiltonicity via a Closure Algorithm

Authors: E. R. Swart, S. J. Gismondi, N. R. Swart, C. E. Bell

Abstract:

We present a heuristic algorithm that decides graph non-Hamiltonicity. All graphs are directed, each undirected edge regarded as a pair of counter-directed arcs. Each of the n! Hamilton cycles in a complete graph on n+1 vertices is mapped to an n-permutation matrix P where p(u,i)=1 if and only if the ith arc in a cycle enters vertex u, starting and ending at vertex n+1. We first create the exclusion set E by noting all arcs (u, v) not in G, sufficient to code precisely all cycles excluded from G, i.e. cycles not in G use at least one arc not in G. Members are pairs of components of P, {p(u,i),p(v,i+1)}, i=1,…,n-1. A doubly stochastic-like relaxed LP formulation of the Hamilton cycle decision problem is constructed. Each {p(u,i),p(v,i+1)} in E is coded as the variable assignment q(u,i,v,i+1)=0, i.e. it shrinks the feasible region. We then implement the Weak Closure Algorithm (WCA), which tests necessary conditions of a matching, together with Boolean closure, to decide 0/1 variable assignments. Each {p(u,i),p(v,j)} not in E is tested for membership in E, and if possible, added to E (q(u,i,v,j)=0) to iteratively maximize |E|. If the WCA constructs E to be maximal, the set of all {p(u,i),p(v,j)}, then G is decided non-Hamiltonian. Only non-Hamiltonian G share this maximal property. Ten non-Hamiltonian graphs (10 through 104 vertices) and 2000 randomized 31-vertex non-Hamiltonian graphs were tested and correctly decided non-Hamiltonian. For Hamiltonian G, the complement of E covers a matching, perhaps useful in searching for cycles. We also present an example where the WCA fails.
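The initial construction of the exclusion set E from the arcs missing in G can be sketched directly from the description above: each missing arc (u, v) contributes the pairs {p(u,i), p(v,i+1)} for i = 1, …, n-1. This is a hypothetical reading of the paper's construction, not the authors' implementation; the function name and data representation are assumptions.

```python
from itertools import product

def exclusion_set(n, arcs):
    """Sketch: code every arc (u, v) absent from G (on vertices 1..n, with
    vertex n+1 the fixed start/end of each cycle) as the excluded pairs
    ((u, i), (v, i+1)) of permutation-matrix components, i = 1..n-1."""
    all_arcs = {(u, v) for u, v in product(range(1, n + 1), repeat=2) if u != v}
    missing = all_arcs - set(arcs)      # arcs not in G
    E = set()
    for (u, v) in missing:
        for i in range(1, n):           # consecutive cycle positions i, i+1
            E.add(((u, i), (v, i + 1)))
    return E
```

A cycle using a missing arc (u, v) must set p(u,i)=1 and p(v,i+1)=1 for some i, so fixing each coded pair to zero excludes exactly those cycles from the relaxed LP's feasible region.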

Keywords: Hamilton cycle decision problem, computational complexity theory, graph theory, theoretical computer science

Procedia PDF Downloads 368