Search results for: language practices
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1492


292 Cost of Governance in Nigeria: In Whose Interest?

Authors: Francis O. Iyoha, Daniel E. Gberevbie, Charles T. Iruonagbe, Matthew E. Egharevba

Abstract:

Cost of governance in Nigeria has become a challenge to development and a concern to practitioners and scholars alike in the fields of business and social science research. In the 2010 national budget of NGN4.6 trillion or USD28.75 billion, for instance, only a paltry sum of NGN1.8 trillion or USD11.15 billion was earmarked for capital expenditure. Similarly, in 2013, out of a total national budget of NGN4.92 trillion or USD30.75 billion, only NGN1.50 trillion or USD9.38 billion was voted for capital expenditure. Therefore, based on data sourced from the Nigerian Office of Statistics, the Central Bank of Nigeria Statistical Bulletin and the United Nations Development Programme, this study examined the causes of the high cost of governance in Nigeria. It found that the high cost of governance in the country serves the interest of the ruling class, arising from their unethical behaviour: corrupt practices and the poor management of public resources. As a result, the study recommends intensifying the war against corruption and the mismanagement of public resources by government officials as a possible solution to the high cost of governance in Nigeria. This could be achieved by strengthening the constitutional powers of the various anti-corruption agencies in the areas of arrest, investigation and prosecution of offenders, without interference from the executive arm of government at the local, state or federal level.

Keywords: Capital expenditure, cost of governance, recurrent expenditure, unethical behaviour.

291 Creating Shared Value: A Paradigm Shift from Corporate Social Responsibility to Creating Shared Value

Authors: Bolanle Deborah Motilewa, E.K. Rowland Worlu, Gbenga Mayowa Agboola, Marvellous Aghogho Chidinma Gberevbie

Abstract:

Businesses operating in the modern business world face varying challenges, among which is the need to ensure that they perform their societal function of being responsible in the society in which they operate. This responsibility to society is generally termed corporate social responsibility. For many years, the practice of corporate social responsibility (CSR) was solely philanthropic: organizations gave ‘charity’ or ‘alms’ to society without any link to the organization’s mission and objectives. However, the application of CSR has shifted from an act of philanthropy to a business strategy through which organizations create a win-win situation, performing their societal obligation whilst simultaneously meeting their economic obligation. More recently, the term has moved from CSR to creating shared value, which is simply corporate policies and practices that enhance the competitiveness of a business organization while simultaneously advancing social and economic conditions in the communities in which the company operates. Creating shared value has found particular resonance in underdeveloped countries, which face deep societal challenges that businesses can solve whilst creating economic value. This study thus reviews the literature on CSR, conceptualizing the shift to creating shared value and finally considering its potential significance for Africa’s development.

Keywords: Corporate social responsibility, shared value, Africapitalism.

290 Lattice Boltzmann Simulation of Binary Mixture Diffusion Using Modern Graphics Processors

Authors: Mohammad Amin Safi, Mahmud Ashrafizaadeh, Amir Ali Ashrafizaadeh

Abstract:

A highly optimized implementation of binary mixture diffusion with no initial bulk velocity on graphics processors is presented. The lattice Boltzmann model is employed for simulating the binary diffusion of oxygen and nitrogen into each other with different initial concentration distributions. Simulations have been performed using the latest proposed lattice Boltzmann model that satisfies both the indifferentiability principle and the H-theorem for multi-component gas mixtures. Contemporary numerical optimization techniques such as memory alignment and increasing the multiprocessor occupancy are exploited along with some novel optimization strategies to enhance the computational performance on graphics processors using the C for CUDA programming language. Speedup of more than two orders of magnitude over single-core processors is achieved on a variety of Graphical Processing Unit (GPU) devices ranging from conventional graphics cards to advanced, high-end GPUs, while the numerical results are in excellent agreement with the available analytical and numerical data in the literature.
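
To make the method concrete, the sketch below shows the basic collision-streaming cycle of the lattice Boltzmann method for a single diffusing species on a D2Q9 lattice, written in plain NumPy. It is a deliberately simplified, CPU-only illustration with arbitrary grid size and relaxation time, not the paper's two-component, CUDA-optimized model.

    import numpy as np

    # D2Q9 lattice velocities and weights (standard values)
    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])
    w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

    nx, ny, tau, steps = 64, 64, 0.8, 500       # arbitrary example parameters
    rho = np.ones((nx, ny))
    rho[: nx // 2, :] = 2.0                     # step in the initial concentration
    f = w[:, None, None] * rho[None, :, :]      # equilibrium initialisation, zero bulk velocity

    for _ in range(steps):
        feq = w[:, None, None] * rho[None, :, :]     # zero-velocity equilibrium distribution
        f += (feq - f) / tau                         # BGK collision step
        for i, (cx, cy) in enumerate(c):             # streaming step with periodic boundaries
            f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
        rho = f.sum(axis=0)                          # macroscopic concentration field

On a GPU, the same cycle is parallelized over lattice nodes, and optimizations such as memory alignment and occupancy tuning mainly target the memory access pattern of the streaming step.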

Keywords: Lattice Boltzmann model, Graphical processing unit, Binary mixture diffusion, 2D flow simulations, Optimized algorithm.

289 Modelling Phytoremediation Rates of Aquatic Macrophytes in Aquaculture Effluent

Authors: E. A. Kiridi, A. O. Ogunlela

Abstract:

Pollutants from aquacultural practices constitute environmental problems, and phytoremediation could offer a cheaper, environmentally sustainable alternative, since equipment using advanced treatment for fish tank effluent is expensive to import, install, operate and maintain, especially in developing countries. The main objective of this research was, therefore, to develop a mathematical model for phytoremediation by aquatic plants in aquaculture wastewater. Other objectives were to evaluate the effect of retention times on phytoremediation rates using the model and to measure the nutrient level of the aquaculture effluent and the phytoremediation rates of three aquatic macrophytes, namely water hyacinth (Eichhornia crassipes), water lettuce (Pistia stratiotes) and morning glory (Ipomoea asarifolia). A completely randomized experimental design was used in the study. Approximately 100 g of each macrophyte were introduced into the hydroponic units and phytoremediation indices were monitored at 8 different intervals from the first to the 28th day. The water quality parameters measured were pH and electrical conductivity (EC), together with the concentrations of ammonium-nitrogen (NH4+-N), nitrite-nitrogen (NO2--N), nitrate-nitrogen (NO3--N) and phosphate-phosphorus (PO43--P), and biomass. The biomass produced by water hyacinth was 438.2 g, 600.7 g, 688.2 g and 725.7 g at four 7-day intervals. The corresponding values for water lettuce were 361.2 g, 498.7 g, 561.2 g and 623.7 g, and for morning glory 417.0 g, 567.0 g, 642.0 g and 679.5 g. The coefficient of determination was greater than 80% for EC, TDS, NO2--N and NO3--N, and greater than 70% for NH4+-N, using any of the macrophytes, and the predicted values were within the 95% confidence interval of the measured values. Therefore, the model is valuable in the design and operation of phytoremediation systems for aquaculture effluent.

Keywords: Phytoremediation, macrophytes, hydroponic unit, aquaculture effluent, mathematical model.

288 Developing Rice Disease Analysis System on Mobile via iOS Operating System

Authors: Rujijan Vichivanives, Kittiya Poonsilp, Canasanan Wanavijit

Abstract:

This research aims to create a mobile tool to analyze rice diseases quickly and easily. The principles of object-oriented software engineering and the Objective-C language were used for the software development, and the decision tree technique was used as the analysis method. Application users can select the features of a rice disease, or the color that appears on the rice leaves, and view the recognition results on the iOS mobile screen. After completing the software development, unit testing and integration testing were used to check program validity. In addition, three plant experts and forty farmers assessed the usability and benefit of this system. Overall user satisfaction was found to be at a good level (57%). The plant experts suggested adding more disease symptoms to the database for more precise analysis results. For further research, it is suggested that an image processing system be developed as a tool that allows users to search for and analyze rice diseases more conveniently and with greater accuracy.
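
As a rough illustration of the decision-tree analysis described above, the sketch below trains a classifier on a toy table of symptom features; the feature names, integer encodings and disease labels are invented for the example and are not taken from the paper's database.

    from sklearn.tree import DecisionTreeClassifier

    # Toy training data: [lesion_color, lesion_shape, leaf_part] encoded as integers
    # (color 0: brown, 1: yellow, 2: grey; shape 0: spot, 1: streak; part 0: tip, 1: blade)
    X = [[0, 0, 1], [1, 1, 1], [2, 0, 0], [0, 1, 1], [2, 1, 0]]
    y = ["brown_spot", "bacterial_blight", "blast", "bacterial_blight", "blast"]

    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

    # A user-selected combination of symptoms is mapped to the same encoding and classified
    print(tree.predict([[2, 0, 0]]))   # -> ['blast']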

Keywords: Rice disease, analysis system, mobile application, iOS operating system.

287 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study

Authors: K. Adu Michael, K. Alese Boniface

Abstract:

Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The comprehensive problem statement needed to execute a proper requirements engineering process is often missing. The need to describe the ‘what’ of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze and validate system requirements. This process is needed to reduce software errors at the early stages of software development. The importance of each step in requirements engineering is clearly explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with expectations from the new system. This paper identifies an inadequate requirements engineering process as the major cause of poor software development in developing nations, using a case study of final-year computer science students at a tertiary-education institution in Nigeria.

Keywords: Client/Customer, Problem Statement, Requirements Engineering, Software Developers.

286 Advanced Geolocation of IP Addresses

Authors: Robert Koch, Mario Golling, Gabi Dreo Rodosek

Abstract:

Tracing and locating the geographical position of users (Geolocation) is used extensively in today's Internet. Whenever we request a page from Google, for example, we are - unless a specific configuration has been made - automatically forwarded to the page in the relevant language and, depending on the location identified, presented with location-specific commercials, among other things. Geolocation has a particularly significant impact within the area of network security. Because of the way the Internet works, attacks can be executed from almost anywhere. Therefore, for attribution, knowledge of the origin of an attack - and thus Geolocation - is mandatory in order to be able to trace back an attacker. In addition, Geolocation can also be used very successfully to increase the security of a network during operation (i.e. before an intrusion has actually taken place). Similar to greylisting in email, Geolocation makes it possible to (i) correlate detected attacks with new connections and (ii), as a consequence, classify traffic a priori as more suspicious (thus allowing this traffic to be inspected in more detail). Although numerous techniques for Geolocation exist, each strategy is subject to certain restrictions. Following the ideas of Endo et al., this publication tries to overcome these shortcomings with a combined solution of different methods to allow improved and optimized Geolocation. Thus, we present our architecture for improved Geolocation, designing a new algorithm which combines several Geolocation techniques to increase accuracy.
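
The combination idea can be pictured as fusing the position estimates of several independent techniques. The sketch below is a generic, hypothetical illustration (a confidence-weighted average of latitude/longitude estimates); it is not the algorithm proposed in this paper, whose details are not given in the abstract.

    # Hypothetical estimates from different geolocation techniques:
    # (latitude, longitude, confidence weight in [0, 1])
    estimates = {
        "whois_registry":    (48.137, 11.575, 0.3),
        "geoip_database":    (48.135, 11.582, 0.6),
        "delay_measurement": (48.150, 11.560, 0.8),
    }

    def combine(estimates):
        """Confidence-weighted average of the individual position estimates."""
        total = sum(w for _, _, w in estimates.values())
        lat = sum(la * w for la, _, w in estimates.values()) / total
        lon = sum(lo * w for _, lo, w in estimates.values()) / total
        return lat, lon

    print(combine(estimates))   # fused position estimate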

Keywords: IP geolocation, prosecution of computer fraud, attack attribution, target-analysis

285 A Neuron Model of Facial Recognition and Detection of an Authorized Entity Using Machine Learning System

Authors: J. K. Adedeji, M. O. Oyekanmi

Abstract:

This paper critically examines the use of machine learning procedures in curbing unauthorized access to valuable areas of an organization. The use of passwords, PIN codes and user identification has in recent times been only partially successful in curbing identity-related crimes, hence the need to design a system which incorporates biometric characteristics such as DNA and pattern recognition of variations in facial expressions. The facial model is based on the OpenCV library, which relies on certain physiological features; a Raspberry Pi 3 module is used to run the OpenCV library, which extracts the detected faces through a camera and stores them in the datasets directory. The model is trained with a 50-epoch run on the database and recognition is performed by the Local Binary Pattern Histogram (LBPH) recognizer contained in OpenCV. The training algorithm used by the neural network is backpropagation, coded in Python, with 200-epoch runs to identify specific resemblance in the exclusive OR (XOR) output neurons. The research confirmed that physiological parameters are more effective measures for curbing identity-related crimes.
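
A minimal sketch of the LBPH workflow described above is given below, using the OpenCV contrib module (cv2.face). The file layout and label mapping are assumptions made for the example, and the backpropagation stage of the paper is omitted.

    import cv2
    import numpy as np

    # Assumed layout: datasets/<label_id>/<image>.png containing grayscale face crops
    faces = [cv2.imread("datasets/1/face_01.png", cv2.IMREAD_GRAYSCALE),
             cv2.imread("datasets/2/face_01.png", cv2.IMREAD_GRAYSCALE)]
    labels = np.array([1, 2])

    # LBPH recognizer from the opencv-contrib package
    recognizer = cv2.face.LBPHFaceRecognizer_create()
    recognizer.train(faces, labels)

    # Predict the identity of a new face crop; a lower confidence value means a closer match
    probe = cv2.imread("datasets/1/face_02.png", cv2.IMREAD_GRAYSCALE)
    label, confidence = recognizer.predict(probe)
    print(label, confidence)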

Keywords: Biometric characteristics, facial recognition, neural network, OpenCV.

284 BugCatcher.Net: Detecting Bugs and Proposing Corrective Solutions

Authors: Sheetal Chavan, P. J. Kulkarni, Vivek Shanbhag

Abstract:

Although achieving a zero-defect software release is practically impossible, software companies should take maximum care to detect defects/bugs well ahead of time, allowing only a bare minimum to creep into the released version. This is a clear indicator of the important role time plays in bug detection. In addition, software quality is a major factor in the software engineering process. Moreover, early detection can be achieved only through static code analysis as opposed to conventional testing. BugCatcher.Net is a static analysis tool which detects bugs in .NET® languages through MSIL (Microsoft Intermediate Language) inspection. The tool utilizes a parser based on finite state automata to carry out bug detection. Once detected, bugs need to be corrected immediately. BugCatcher.Net facilitates correction by proposing a corrective solution for reported warnings/bugs to end users with minimum side effects. Moreover, the tool is also capable of analyzing the bug trend of a program under inspection.
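
To make the finite-state-automaton idea concrete, the toy sketch below scans a simplified instruction stream for a 'use after dispose' pattern. The instruction names and the bug pattern are invented for illustration and are far simpler than real MSIL inspection.

    # States of the toy automaton tracking one object
    ALIVE, DISPOSED, BUGGY = "alive", "disposed", "buggy"

    TRANSITIONS = {
        (ALIVE, "call_dispose"): DISPOSED,
        (DISPOSED, "call_method"): BUGGY,      # using the object after dispose is a bug
        (DISPOSED, "assign_new"): ALIVE,       # re-assignment makes the object usable again
    }

    def scan(instructions):
        """Run the automaton over an instruction stream and report detected bugs."""
        state, warnings = ALIVE, []
        for index, op in enumerate(instructions):
            state = TRANSITIONS.get((state, op), state)
            if state == BUGGY:
                warnings.append(f"use-after-dispose at instruction {index}")
                state = DISPOSED               # keep scanning for further occurrences
        return warnings

    print(scan(["call_method", "call_dispose", "call_method"]))   # -> one warning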

Keywords: Dependence, Early solution, Finite State Automata, Grammar, Late solution, Parser State Transition Diagram, Static Program Analysis.

283 Fundamental Variables of Final Account Closing Success in Construction Projects in Malaysia

Authors: Zarabizan Zakaria, Syuhaida Ismail, Aminah Md Yusof

Abstract:

The project management process runs from the planning stage to the stage of completion (handover of buildings, preparation of the final accounts and the closing balance). Since this process is not easy to implement efficiently and effectively, unsuccessful delivery as per contract has become a major problem for construction projects. These issues have been blamed mainly on inefficient traditional construction practices that continue to dominate the current industry. This is due to several factors, such as the construction technology environment, sophisticated design and customer demand, which are constantly changing and which influence, either directly or indirectly, the practice of management. Among the identified influences are the physical environment, the social environment, the information environment, and the political and moral atmosphere. Therefore, this paper sets out to determine the fundamental variables of final account closing success in construction projects. This aim is pursued through the objectives of identifying the key constraints to the closing of final accounts in construction projects in Malaysia, investigating solutions to the identified constraints and analysing the relative levels of impact of those constraints. It is expected that this paper provides effective measures to avoid, or at least reduce to the optimum level, the problems in final account closing. It is also anticipated that the findings reported in this paper could address the contributors to unsuccessful final account closing and define tools for their mitigation, for the better development of construction projects.

Keywords: Fundamental variables, closing of final account, construction project, Malaysia.

282 A New Computational Tool for Noise Prediction of Rotating Surfaces (FACT)

Authors: Ana Vieira, Fernando Lau, João Pedro Mortágua, Luís Cruz, Rui Santos

Abstract:

The environmental impact of air transport is more than ever a limiting obstacle to the aeronautical industry's continuous growth. Over the last decades, considerable effort has been made to obtain quieter aircraft solutions, whether by changing the original design or by investigating quieter maneuvers. The noise propagated by rotating surfaces is one of the most important sources of annoyance, being present in most aerial vehicles. Bearing this in mind, CEIIA developed a new computational chain for noise prediction with in-house software tools, to obtain solutions in relatively short time without using excessive computer resources. This work is based on the new acoustic tool, which aims to predict the rotor noise generated during steady and maneuvering flight, making use of the flexibility of the C language and the speed advantages of GPU programming. The acoustic tool is based on Farassat's Formulation 1A and is capable of predicting two important types of noise: loading and thickness noise. The present work describes the most important features of the acoustic tool, presenting its most relevant results and framework analyses for helicopters and UAV quadrotors.
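
As background, Formulation 1A splits the acoustic pressure perceived at an observer position x and time t into the two contributions named above, both evaluated at the emission (retarded) time; schematically,

    \[
      p'(\mathbf{x}, t) = p'_T(\mathbf{x}, t) + p'_L(\mathbf{x}, t),
    \]

where the thickness term p'_T is driven by the normal velocity of the moving blade surface and the loading term p'_L by the unsteady surface pressures, each integrated over the blade surface at the retarded time tau = t - r/c (r being the source-observer distance and c the speed of sound). The full surface-integral expressions are those of Farassat's original formulation and are not reproduced here.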

Keywords: Rotor noise, acoustic tool, GPU Programming, UAV noise.

281 Maternal Smoking and Risk of Childhood Overweight and Obesity: A Meta-Analysis

Authors: Martina Kanciruk, Jac W. Andrews, Tyrone Donnon

Abstract:

The purpose of this study was to determine the significance of maternal smoking for the development of childhood overweight and/or obesity. Accordingly, a systematic literature review of English-language studies published from 1980 to 2012 was conducted using the following databases: MEDLINE, PsychINFO, Cochrane Database of Systematic Reviews, and Dissertation Abstracts International. The following terms were used in the search: pregnancy, overweight, obesity, smoking, parents, childhood, risk factors. Eighteen studies of maternal smoking during pregnancy and obesity, conducted in Europe, Asia, North America, and South America, met the inclusion criteria. A meta-analysis of these studies indicated that maternal smoking during pregnancy is a significant risk factor for overweight and obesity; the offspring of mothers who smoke during pregnancy are at greater risk of developing obesity or overweight; and the quantity of cigarettes consumed by the mother during pregnancy influenced the odds of offspring overweight and/or obesity. In addition, the results from moderator analyses suggest that part of the heterogeneity discovered between the studies can be explained by the region of the world in which the study occurred and the age of the child at the time of weight assessment.
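
For readers unfamiliar with the pooling step of a meta-analysis, the generic inverse-variance form is sketched below; this is standard machinery, not a reproduction of the study's actual computation.

    \[
      \hat{\theta} = \frac{\sum_i w_i \theta_i}{\sum_i w_i}, \qquad
      w_i = \frac{1}{v_i}, \qquad
      SE(\hat{\theta}) = \frac{1}{\sqrt{\sum_i w_i}},
    \]

where theta_i is the effect size reported by study i (for example, a log odds ratio of childhood overweight for smoking versus non-smoking mothers) and v_i its within-study variance; a random-effects model additionally adds a between-study variance component to each v_i.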

Keywords: Childhood obesity, overweight, smoking, parents, risk factors.

280 A Recognition Method of Ancient Yi Script Based on Deep Learning

Authors: Shanxiong Chen, Xu Han, Xiaolong Wang, Hui Ma

Abstract:

Yi is an ethnic group mainly living in mainland China, with its own spoken and written language systems developed over thousands of years. Ancient Yi is one of the six ancient languages in the world; it keeps a record of the history of the Yi people and offers documents valuable for research into human civilization. Recognition of the characters in ancient Yi helps to transform the documents into electronic form, making their storage and dissemination convenient. Due to historical and regional limitations, research on the recognition of ancient characters is still inadequate. Thus, deep learning technology was applied to the recognition of such characters. Five models were developed on the basis of a four-layer convolutional neural network (CNN). Alpha-Beta divergence was taken as a penalty term to re-encode the output neurons of the five models. Two fully connected layers fulfilled the compression of the features. Finally, at the softmax layer, the orthographic features of ancient Yi characters were re-evaluated, their probability distributions were obtained, and the characters with the highest-probability features were recognized. Tests show that the method has achieved higher precision compared with the traditional CNN model for handwriting recognition of ancient Yi.
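
The sketch below shows what a four-convolutional-layer network with two fully connected layers and a softmax output looks like in PyTorch. The channel counts, the assumed 64x64 grayscale input and the number of classes are placeholders for the example, and the Alpha-Beta divergence penalty term described above is omitted.

    import torch
    import torch.nn as nn

    class AncientYiCNN(nn.Module):
        """Minimal 4-conv-layer CNN sketch; the divergence penalty and re-encoding are omitted."""

        def __init__(self, n_classes=1000):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(128, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(128 * 4 * 4, 256), nn.ReLU(),   # assumes 64x64 grayscale input
                nn.Linear(256, n_classes),
            )

        def forward(self, x):
            logits = self.classifier(self.features(x))
            return torch.softmax(logits, dim=1)           # probability distribution over characters

    model = AncientYiCNN(n_classes=1000)
    probs = model(torch.randn(1, 1, 64, 64))              # one dummy character image
    print(probs.argmax(dim=1))                            # index of the recognized character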

Keywords: Recognition, CNN, convolutional neural network, Yi character, divergence.

279 Corporate Cautionary Statement: A Genre of Professional Communication

Authors: Chie Urawa

Abstract:

Cautionary statements or disclaimers in corporate annual reports need to be carefully designed, because clear cautionary statements may protect a company in the case of legal disputes yet may also undermine positive impressions. This study compares the language of cautionary statements using two corpora, Sony's cautionary statement corpus (S-corpus) and Panasonic's cautionary statement corpus (P-corpus), illustrating the differences and similarities in the use of meaningful cautionary statements and critically analyzing why practitioners word them the way they do. The findings describe the distinct differences between the two companies in the presentation of the risk factors and in the way they make the statements. The word ability is used more for legal protection in the S-corpus, whereas the word possibility is used more to convey a better impression in the P-corpus. The main similarities are identified in the use of lexical words and pronouns, and in the use of almost the same wording over eight years. The findings show how each company makes the statements unique to itself in the presentation of risk factors, and reveal the characteristics of a specific genre of professional communication. Important implications of this study are that a more comprehensive approach can be applied in other contexts and can be used by companies to reflect upon their cautionary statements.
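
The frequency comparison underlying such a corpus study can be illustrated in a few lines of Python; the two example sentences below merely stand in for the Sony and Panasonic corpora and are not quotations from them.

    from collections import Counter

    s_corpus = "our ability to compete may be affected by risks beyond our control".split()
    p_corpus = "there is a possibility that actual results may differ from expectations".split()

    def relative_freq(tokens, word):
        """Occurrences of `word` per 1,000 tokens."""
        return 1000 * Counter(tokens)[word] / len(tokens)

    for word in ("ability", "possibility"):
        print(word, relative_freq(s_corpus, word), relative_freq(p_corpus, word))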

Keywords: Cautionary statements, corporate annual reports, corpus, risk factors.

278 Topology of Reverse Von-Kármán Vortex Street in the Wake of a Swimming Whale Shark

Authors: Arash Taheri

Abstract:

In this paper, the effects of the ventral body planform of a swimming whale shark on the formation of the ‘reverse von-Kármán vortex street’ behind the aquatic animal are studied using a Fluid-Structure Interaction (FSI) approach. In this regard, the incompressible Navier-Stokes equations around the whale shark's body with prescribed deflection dynamics are solved with the aid of the Boundary Data Immersion Method (BDIM) and Implicit Large Eddy Simulation (ILES) turbulence treatment by the WaterLily.jl solver, fully written in the Julia programming language. The whale shark flow simulations are performed at a high Reynolds number, i.e. 1.4 × 10^7, corresponding to the swimming of a 10 m whale shark at an average speed of 5 km/h. For comparison purposes, vortical flow generation behind a silky shark with a streamlined forehead eidonomy is also simulated at a high Reynolds number, Re = 2 × 10^6, corresponding to the swimming of a 2 m silky shark at an average speed of 3.6 km/h. The results depict the formation of distinct wake topologies behind the swimming sharks depending on the travelling-wave oscillation amplitudes.
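
The quoted Reynolds numbers follow directly from the stated sizes and speeds. Assuming a kinematic viscosity of water of about 1.0 × 10^-6 m^2/s and noting that 5 km/h is about 1.39 m/s, the whale shark case gives

    \[
      \mathrm{Re} = \frac{U L}{\nu}
                  \approx \frac{1.39\ \mathrm{m/s} \times 10\ \mathrm{m}}{1.0 \times 10^{-6}\ \mathrm{m^2/s}}
                  \approx 1.4 \times 10^{7},
    \]

and the silky shark case (1.0 m/s over a 2 m body length) similarly gives Re ≈ 2 × 10^6.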

Keywords: Whale shark, vortex street, BDIM, FSI, functional eidonomy, bionics.

277 An Enhanced Distributed System to Improve the Time Complexity of Binary Indexed Trees

Authors: Ahmed M. Elhabashy, A. Baes Mohamed, Abou El Nasr Mohamad

Abstract:

Distributed computing systems are usually considered the most suitable model for practical solutions of many parallel algorithms. In this paper, an enhanced distributed system is presented to improve the time complexity of Binary Indexed Trees (BIT). The proposed system uses multiple uniform processors with identical architectures and a specially designed distributed memory system. The analysis of this system has shown that it reduces the time complexity of the read query to O(log(log N)) and the update query to constant complexity, while the naive solution has a time complexity of O(log N) for both queries. The system was implemented and simulated using the VHDL and Verilog hardware description languages, with Xilinx ISE 10.1 as the development environment and ModelSim 6.1c as the simulation tool. The simulation has shown that the overhead resulting from the wiring and communication between the system fragments can be fairly neglected, which makes it possible in practice to reach the maximum speedup offered by the proposed model.
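
For reference, the sketch below is a standard sequential Fenwick (binary indexed) tree with the O(log N) point update and prefix-sum query that the proposed distributed design sets out to improve; it is the baseline structure, not the distributed architecture itself.

    class BinaryIndexedTree:
        """Classic Fenwick tree: O(log N) point update and prefix-sum query."""

        def __init__(self, n):
            self.n = n
            self.tree = [0] * (n + 1)          # 1-based internal array

        def update(self, i, delta):
            """Add `delta` to element i (1-based)."""
            while i <= self.n:
                self.tree[i] += delta
                i += i & (-i)                  # jump to the next responsible node

        def prefix_sum(self, i):
            """Sum of elements 1..i."""
            total = 0
            while i > 0:
                total += self.tree[i]
                i -= i & (-i)                  # strip the least significant bit
            return total

    bit = BinaryIndexedTree(8)
    bit.update(3, 5)
    bit.update(7, 2)
    print(bit.prefix_sum(7))                   # -> 7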

Keywords: Binary Index Tree (BIT), Least Significant Bit (LSB), Parallel Adder (PA), Very High Speed Integrated Circuits Hardware Description Language (VHDL), Distributed Parallel Computing System (DPCS).

276 Water Resources Crisis in Saudi Arabia, Challenges and Possible Management Options: An Analytic Review

Authors: A. A. Ghanim

Abstract:

The Kingdom of Saudi Arabia (KSA) is heading towards a severe and rapidly expanding water crisis, which can have negative impacts on the country's environment and economy. Of the total water consumption in KSA, the agricultural sector accounts for nearly 87% of the total water use and, therefore, any attempt that overlooks this sector will not help in improving the sustainability of the country's water resources. KSA Vision 2030 gives priority for water use in the agriculture sector to the regions that have natural renewable water resources. This means that little attention is given to reusing municipal wastewater for irrigation purposes in any region in general and in water-scarce regions in particular. The use of treated wastewater is very limited in Saudi Arabia, but it has considerable potential for future expansion due to its numerous beneficial uses. This study reviews the current situation of water resources in Saudi Arabia, with particular attention to agriculture and wastewater reuse. The review proposes some corrective measures for the development and better management of water resources in the Kingdom. Suggestions also include the consideration of treated water as an alternative source for irrigation in some regions of the country. The study concludes that a sustainable solution to the water crisis in KSA requires the implementation of multiple measures in an integrated manner. The integrated solution plan should focus on two main directions: first, improving the current management practices of the existing water resources; second, developing new water supplies from both conventional and non-conventional sources.

Keywords: Saudi Arabia, water resources, water crisis, treated wastewater.

275 Towards Model-Driven Communications

Authors: Antonio Natali, Ambra Molesini

Abstract:

In modern distributed software systems, the issue of communication among composing parts represents a critical point, but the idea of extending conventional programming languages with general-purpose communication constructs seems difficult to realize. As a consequence, there is a (growing) gap between the abstraction level required by distributed applications and the concepts provided by the platforms that enable communication. This work discusses how the Model Driven Software Development approach can be considered a mature technology for generating, in an automatic way, the schematic part of applications related to communication, while providing at the same time high-level specialized languages useful in all phases of software production. To achieve this goal, a stack of languages (meta-metamodels) has been introduced in order to describe – at different levels of abstraction – the collaborative behavior of generic entities in terms of communication actions related to a taxonomy of messages. Finally, the generation of platforms for communication is viewed as a form of specification of language semantics, which provides executable models of applications together with model-checking support and effective runtime environments.

Keywords: Interactions, specific languages, meta-models, model driven development.

274 Dependability Tools in Multi-Agent Support for Failures Analysis of Computer Networks

Authors: Myriam Noureddine

Abstract:

During their activity, all systems must remain operational without failures and, in this context, the dependability concept is essential to avoid disruption of their function. As computer networks are systems with the same dependability requirements, this article deals with an analysis of failures for a computer network. The proposed approach integrates specific tools of the KB3 platform, usually applied in dependability studies of industrial systems. The methodology is supported by a multi-agent system formed by six agents grouped into three meta-agents, dealing with two levels. The first level concerns a modeling step through a conceptual agent and a generating agent. The conceptual agent is dedicated to building the knowledge base from the system specifications written in the FIGARO language. The generating agent automatically produces both the structural model and a dependability model of the system. The second level, the simulation, shows the effects of the failures of the system through a simulation agent. The approach is validated by applying it to a specific computer network, giving an analysis of failures through their effects on the considered network.

Keywords: Computer network, dependability, KB3 platform, multi-agent system, failure.

273 The Implementation of Anti-Circumvention Legislations in Thai Copyright System

Authors: Chuencheewin Yimfuang

Abstract:

The WIPO Copyright Treaty (WCT) was established by the World Intellectual Property Organisation (WIPO). This agreement requires the contracting nations to provide adequate protection for technological measures in order to prevent massive copyright infringement on the internet. Thailand had to implement anti-circumvention rules into domestic legislation to comply with this international obligation. The purpose of this paper is to critically discuss the legislative standard under the WCT. It also aims to examine the legal development of technological protection measures in Thailand and to demonstrate that the scope of prohibitions under the Copyright Act 2022 (No. 5) is similar to the Digital Millennium Copyright Act 1998 (DMCA) of the United States (US). It is found that the anti-circumvention laws of Thailand prohibit the circumvention of access-control technologies, and that a regulation on trafficking in circumvention devices has been added to the latest version of the Thai Copyright Act. These legislative developments reveal the attempt to reinforce the legal protection of technological measures and copyright holders in order to be in line with global practices. However, the amendment has problems concerning the legal definitions of an effective technological measure and the prohibited act of circumvention. The vagueness might affect the scope of protection and the boundary of the prohibition. In this respect, the DMCA is evaluated and compared in order to derive guidelines for interpretation and enforcement in Thailand. The lessons and experiences learned from this study might be useful for correcting the flaws, or at least clarifying the ambiguities, embodied in Thai copyright legislation.

Keywords: Legal development, technological protection measure, prohibition, circumvention, Thailand.

272 Iranian Bazaars: The Illustration of Stable Thoughts

Authors: Aida Amirazodi

Abstract:

"Bazaar" is a Persian word from the language of Iranians of 2500 years ago which has entered the languages of other countries. “Bazaar", the trading or marketing place with the architectural principles and concerns, was formed in Iran because of the long experience of marketing. This has become a valuable inheritance of Islamic ideological civilization and Iranian advanced architecture and a model of Islamic-marketing places with spectacular elements and parts, and the place for economical, social and cultural exchanges. “Bazaars" are found in cities of Iran and many Islamic countries in west of Asia and north of Africa. With the stable structure and function as a symbol of social values, this place has become the economic center and the illustration of stable architecture and advanced principles. “Bazaars" as the heart of Iranian cities economy with several major and minor rows of shops, in closed and open areas, along a fixed line or branches with beautiful arcs, patios, and frameworks are among the main national inheritance of Iran and one of the important Iranian architectural treasures because of its Iranian nobility.

Keywords: Traditional Bazaar, Form of Bazaar, Iranian Architecture

271 Family History of Obesity and Risk of Childhood Overweight and Obesity: A Meta-Analysis

Authors: Martina Kanciruk, Jac W. Andrews, Tyrone Donnon

Abstract:

The purpose of this study was to determine the significance of a family history of obesity for the development of childhood overweight and/or obesity. Accordingly, a systematic literature review of English-language studies published from 1980 to 2012 was conducted using the following databases: MEDLINE, PsychINFO, Cochrane Database of Systematic Reviews, and Dissertation Abstracts International. The following terms were used in the search: pregnancy, overweight, obesity, family history, parents, childhood, risk factors. Eleven studies of family history and obesity conducted in Europe, Asia, North America, and South America met the inclusion criteria. A meta-analysis of these studies indicated that a family history of obesity is a significant risk factor for overweight and/or obesity in offspring; the risk of offspring overweight and/or obesity associated with family history varies depending on the family members included in the analysis; and when a family history of obesity is present, the offspring are at greater risk of developing obesity or overweight. In addition, the results from moderator analyses suggest that part of the heterogeneity discovered between the studies can be explained by the region of the world in which the study occurred and the age of the child at the time of weight assessment.

Keywords: Childhood obesity, overweight, family history, risk factors, meta-analysis.

270 Digital Encoder Based Power Frequency Deviation Measurement

Authors: Syed Javed Arif, Mohd Ayyub Khan, Saleem Anwar Khan

Abstract:

In this paper, a simple method is presented for the measurement of power frequency deviations. A phase-locked loop (PLL) is used to multiply the signal under test by a factor of 100. The number of pulses in this pulse train is counted over a stable, known period using decade driving assemblies (DDAs) and flip-flops. These signals are combined using logic gates and then passed through decade counters to give a unique combination of pulses or levels, which are further encoded. These pulses are equally suitable for both control applications and display units. The experimental circuit developed gives a resolution of 1 Hz within a measurement period of 20 ms. The proposed circuit is also simulated in the Verilog hardware description language and implemented using Field Programmable Gate Arrays (FPGAs). A mixed-signal oscilloscope (MSO) is used to observe the results of the FPGA implementation. These results are compared with the results of the proposed circuit built from discrete components. The proposed system is useful for frequency deviation measurement and control in power systems.
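
As a rough check of the counting scheme under the stated parameters, and assuming a nominal 50 Hz power signal: multiplying the input frequency f by K = 100 with the PLL and counting pulses over the gate time T = 20 ms gives

    \[
      N = K f T = 100 \times 50\ \mathrm{Hz} \times 0.02\ \mathrm{s} = 100\ \text{pulses},
    \]

so a deviation of 1 Hz in the input frequency already changes the pulse count within a single 20 ms window, consistent with the resolution reported above; the decade counters and encoder then translate this count into control or display levels.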

Keywords: Frequency measurement, digital control, phase locked loop, encoding, Verilog HDL.

269 Automatic Translation of Ada-ECATNet Using Rewriting Logic

Authors: N. Boudiaf

Abstract:

One major difficulty that faces developers of concurrent and distributed software is the analysis of concurrency-based faults such as deadlocks. Petri nets are used extensively in the verification of correctness of concurrent programs. ECATNets are a category of algebraic Petri nets based on a sound combination of algebraic abstract types and high-level Petri nets. ECATNets have 'sound' and 'complete' semantics because of their integration into rewriting logic and its programming language Maude. Rewriting logic is considered one of the most powerful logics in terms of the description, verification and programming of concurrent systems. We previously proposed a method for translating Ada-95 tasking programs into the ECATNets formalism (Ada-ECATNet) and showed that the ECATNets formalism provides a more compact translation for Ada programs compared to other approaches based on simple Petri nets or colored Petri nets. We also showed previously how the ECATNet formalism offers Ada many validation and verification tools, such as simulation, model checking, accessibility analysis and static analysis. In this paper, we describe the implementation of our translation of Ada programs into ECATNets.

Keywords: Ada tasking, Analysis, Automatic Translation, ECATNets, Maude, Rewriting Logic.

268 Impact of VARK Learning Model at Tertiary Level Education

Authors: Munazza A. Mirza, Khawar Khurshid

Abstract:

Individuals are generally associated with different learning styles, which have been explored extensively in the recent past. Learning styles refer to the potential of an individual by which he or she can easily comprehend and retain information. Among the various learning style models, VARK is the most widely accepted model, which categorizes learners with respect to their sensory characteristics. Based on the number of preferred learning modes, learners can be categorized as uni-modal, bi-modal, tri-modal, or quad/multi-modal. Although there is a prevalent belief in learning styles, the model is not frequently or effectively utilized in higher education. This research describes an identification model to validate the linkage of teachers' didactic practices and students' performance with learning styles. The identification model is recommended for checking the effective application and evaluation of the various learning styles. The proposed model is a guideline for effectively implementing a learning styles inventory in order to ensure that it validates the linkage of performance with learning styles. If performance is linked with learning styles, this may help eradicate the distrust of learning style theory. For this purpose, a comprehensive study was conducted to compare and understand how the VARK inventory model is being used to identify learning preferences and their correlation with learners' performance. A comparative analysis of the findings of these studies is presented to understand the learning styles of tertiary students in various disciplines. It is concluded with confidence that the learning styles of students cannot be associated with any specific discipline. Furthermore, there is not enough empirical proof to link performance with learning styles.

Keywords: Learning style, VARK, sensory preferences, identification model, didactic practices.

267 Semi-Automatic Analyzer to Detect Authorial Intentions in Scientific Documents

Authors: Kanso Hassan, Elhore Ali, Soule-dupuy Chantal, Tazi Said

Abstract:

Information retrieval has the objective of studying models and the realization of systems that allow a user to find the documents relevant to his or her information need. Information search remains a difficult problem because of the difficulty of representing and processing natural languages, with phenomena such as polysemy. Intentional structures promise to be a new paradigm for extending existing document structures and enhancing the different phases of document processing, such as creation, editing, search and retrieval. Recognizing the intentions of the authors of texts can reduce the scale of this problem. In this article, we present an intention recognition system based on a semi-automatic method for extracting intentional information from a corpus of texts. This system is also able to update the ontology of intentions to enrich the knowledge base containing all possible intentions of a domain. This approach uses the construction of a semi-formal ontology, which is considered the conceptualization of the intentional information contained in a text. Experiments on scientific publications in the field of computer science were conducted to validate this approach.

Keywords: Information retrieval, text analysis, intentional structure, segmentation, ontology, natural language processing.

266 Health Risk Assessment of Heavy Metals in the Contaminated and Uncontaminated Soils

Authors: S. A. Nta

Abstract:

The application of health risk assessment methods is important in order to comprehend the risk of human exposure to heavy metals and other dangerous pollutants. Four soil samples were collected at distances of 10, 20 and 30 m from the dump site, and a control 100 m away, at depths of 0.3, 0.6 and 0.9 m. The collected soil samples were examined for Zn, Cu, Pb, Cd and Ni using standard methods. The health risks via the main pathways of human exposure to heavy metals were assessed using the relevant standard equations. The hazard quotient was calculated to determine the non-carcinogenic health risk for each individual heavy metal. The lifetime cancer risk was calculated to determine the cumulative lifetime cancer risk for each exposure pathway. The estimated health risk values for adults and children were generally lower than the reference dose. The calculated hazard quotients for the ingestion, inhalation and dermal contact pathways were less than unity. This means that there is no detrimental health concern from human exposure to heavy metals in the contaminated soil through these pathways. The lifetime cancer risk of 5.4 × 10^-2 was, however, higher than the acceptable threshold value of 1 × 10^-4, which indicates significant potential health effects from human exposure to heavy metals in the contaminated soil. Good hygienic practices are recommended to ease the potential risk to children and adults who are exposed to contaminated soils. Also, the local authorities should be made aware of such health risks for the purpose of planning the management strategy accordingly.
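
The two indices referred to above follow, in generic form, the standard screening equations used in common health-risk-assessment guidance (not transcribed from this paper):

    \[
      HQ = \frac{ADD}{RfD}, \qquad LCR = ADD \times CSF,
    \]

where ADD is the average daily dose of a metal through a given pathway (ingestion, inhalation or dermal contact), RfD its reference dose and CSF its cancer slope factor; HQ < 1 indicates no appreciable non-carcinogenic risk, while an LCR above roughly 1 × 10^-4 is commonly treated as unacceptable.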

Keywords: Health risk assessment, pollution, heavy metals, soil.

265 Rank-Based Chain-Mode Ensemble for Binary Classification

Authors: Chongya Song, Kang Yen, Alexander Pons, Jin Liu

Abstract:

In the field of machine learning, ensembles are commonly employed to improve performance over multiple base classifiers. However, true predictions are often canceled out by false ones during consensus due to a phenomenon called the “curse of correlation”, which manifests as strong interference among the predictions produced by the base classifiers. In addition, existing practices are still not able to effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that the two problems are caused by some inherent deficiencies in the consensus approach. Therefore, we create an enhanced ensemble algorithm which adopts a designed rank-based chain-mode consensus to overcome the two problems. In order to evaluate the proposed ensemble algorithm, we employ the well-known benchmark data set NSL-KDD (the improved version of the KDDCup99 dataset produced by the University of New Brunswick) to make comparisons between the proposed and 8 common ensemble algorithms. In particular, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in terms of the improvements in accuracy and reliability upon the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus is proved to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, which include accuracy, precision, recall, F1-score and area under the receiver operating characteristic curve.
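
The 'traditional consensus' that the paper improves upon can be illustrated by a plain majority vote over the base classifiers, sketched below. The rank-based chain-mode consensus itself is not reproduced here, since the abstract does not specify its details.

    from collections import Counter

    def majority_vote(predictions_per_classifier):
        """Traditional consensus: each base classifier votes, the majority label wins.

        `predictions_per_classifier` is a list of per-classifier prediction lists,
        all over the same samples (binary labels 0/1 in this toy example).
        """
        n_samples = len(predictions_per_classifier[0])
        fused = []
        for j in range(n_samples):
            votes = Counter(preds[j] for preds in predictions_per_classifier)
            fused.append(votes.most_common(1)[0][0])
        return fused

    # Three toy base classifiers over four samples: if the true label of the last
    # sample is 1, the two correlated wrong votes cancel the single correct one.
    base_predictions = [[1, 0, 0, 0],
                        [1, 0, 0, 0],
                        [0, 0, 0, 1]]
    print(majority_vote(base_predictions))   # -> [1, 0, 0, 0]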

Keywords: Consensus, curse of correlation, imbalanced classification, rank-based chain-mode ensemble.

264 FPGA Based Parallel Architecture for the Computation of Third-Order Cross Moments

Authors: Syed Manzoor Qasim, Shuja Abbasi, Saleh Alshebeili, Bandar Almashary, Ateeq Ahmad Khan

Abstract:

Higher-order statistics (HOS), such as cumulants and cross moments, and their frequency-domain counterparts, known as polyspectra, have emerged as a powerful signal processing tool for the synthesis and analysis of signals and systems. Algorithms used for the computation of cross moments are computationally intensive and require high computational speed for real-time applications. For efficiency and high speed, it is often advantageous to realize computation-intensive algorithms in hardware. A promising solution that combines high flexibility with the speed of traditional hardware is the Field Programmable Gate Array (FPGA). In this paper, we present an FPGA-based parallel architecture for the computation of third-order cross moments. The proposed design is coded in the Very High Speed Integrated Circuit (VHSIC) Hardware Description Language (VHDL) and functionally verified by implementing it on a Xilinx Spartan-3 XC3S2000FG900-4 FPGA. Implementation results are presented, showing that the proposed design can operate at a maximum frequency of 86.618 MHz.
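
For reference, a direct (biased) estimator of the third-order cross moment of three zero-mean sequences, the quantity that the FPGA architecture computes in parallel, can be written in a few lines of NumPy; this is the textbook definition, not the hardware design itself.

    import numpy as np

    def third_order_cross_moment(x, y, z, tau1, tau2):
        """Biased estimate of m_xyz(tau1, tau2) = E[x(n) y(n+tau1) z(n+tau2)], tau1, tau2 >= 0."""
        n = len(x)
        m = n - max(tau1, tau2)                 # number of usable samples for the given lags
        return np.sum(x[:m] * y[tau1:tau1 + m] * z[tau2:tau2 + m]) / n

    rng = np.random.default_rng(0)
    x, y, z = (rng.standard_normal(1024) for _ in range(3))   # toy zero-mean test signals
    print(third_order_cross_moment(x, y, z, 1, 2))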

Keywords: Cross moments, Cumulants, FPGA, Hardware Implementation.

263 Learning Outcomes Alignment across Engineering Core Courses

Authors: A. Bouabid, B. Bielenberg, S. Ainane, N. Pasha

Abstract:

In this paper, a team of faculty members of the Petroleum Institute in Abu Dhabi, UAE, representing six different courses across General Engineering (ENGR), Communication (COMM), and Design (STPS), worked together to establish a clear developmental progression of learning outcomes and performance indicators for targeted knowledge, areas of competency, and skills for the first three semesters of the Bachelor of Science in Engineering curriculum. The sequences of courses studied in this project were ENGR/COMM, COMM/STPS, and ENGR/STPS. For each course's nine areas of knowledge, competency, and skills, the research team reviewed the existing learning outcomes and related performance indicators with a focus on identifying linkages across disciplines as well as within the courses of a discipline. The team reviewed existing performance indicators for developmental progression from semester to semester for related courses within the same discipline (vertical alignment) and for courses from different disciplines within the same semester (horizontal alignment). The results of this work have led to recommendations for modifications of the initial indicators where incoherence was identified, and/or for new indicators based on best practices (identified through literature searches) where gaps were identified. It also led to recommendations for modifications of the level of emphasis within each course to ensure developmental progression. The exercise has led to a revised Sequence Performance Indicator Mapping for the knowledge, skills, and competencies across the six core courses.

Keywords: Curriculum alignment, horizontal and vertical progression, performance indicators, skill level.
