Search results for: network performance
4365 Machine Learning Approach for Mutation Testing
Authors: Michael Stewart
Abstract:
Mutation testing is a type of software testing proposed in the 1970s in which program statements are deliberately changed to introduce simple errors so that test cases can be validated to determine whether they can detect the errors. Test cases are executed against the mutant code to determine whether one fails, thereby detecting the error and confirming that the program is correct. One major issue with this type of testing is that generating and testing all possible mutations for complex programs became computationally intensive. This paper used reinforcement learning and parallel processing within the context of mutation testing for the selection of mutation operators and test cases, which reduced the computational cost of testing and improved test suite effectiveness. Experiments were conducted using sample programs to determine how well the reinforcement learning-based algorithm performed with one live mutation, multiple live mutations and no live mutations. The experiments, measured by mutation score, were used to update the algorithm and improve the accuracy of its predictions. The performance was then evaluated on multiprocessor computers. With reinforcement learning, the number of mutation operators utilized was reduced by 50–100%.
Keywords: automated-testing, machine learning, mutation testing, parallel processing, reinforcement learning, software engineering, software testing
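The abstract does not give the algorithm's implementation details; as a rough illustration of the idea of learning which mutation operators are worth applying, the sketch below uses a simple epsilon-greedy bandit whose reward is the observed mutation score. The operator names, the reward signal, and the kill_rate function are hypothetical placeholders, not taken from the paper.

```python
import random

# Hypothetical mutation operators; the paper does not list its operator set.
OPERATORS = ["AOR", "ROR", "COR", "SDL", "UOI"]

def kill_rate(operator):
    """Placeholder for running the test suite against mutants generated
    by `operator` and returning the fraction that were killed."""
    return random.random()  # stand-in for real test execution

def select_operators(episodes=200, epsilon=0.1):
    value = {op: 0.0 for op in OPERATORS}   # estimated mutation score per operator
    count = {op: 0 for op in OPERATORS}
    for _ in range(episodes):
        if random.random() < epsilon:        # explore a random operator
            op = random.choice(OPERATORS)
        else:                                # exploit the best-scoring operator so far
            op = max(value, key=value.get)
        reward = kill_rate(op)               # mutation score observed this episode
        count[op] += 1
        value[op] += (reward - value[op]) / count[op]  # incremental mean update
    # keep only operators whose estimated score justifies the cost of mutation
    return [op for op, v in value.items() if v >= 0.5]

if __name__ == "__main__":
    print(select_operators())
```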
Procedia PDF Downloads 198
4364 Analysis of the Homogeneous Turbulence Structure in Uniformly Sheared Bubbly Flow Using First and Second Order Turbulence Closures
Authors: Hela Ayeb Mrabtini, Ghazi Bellakhal, Jamel Chahed
Abstract:
The presence of the dispersed phase in gas-liquid bubbly flow considerably alters the liquid turbulence. The bubbles induce turbulent fluctuations that enhance the global liquid turbulence level and alter the mechanisms of turbulence. RANS modeling of uniformly sheared flows over an isolated sphere centered in a control volume is performed using first and second order turbulence closures. The sphere is placed in the production-dissipation equilibrium zone, where the liquid velocity is set equal to the relative velocity of the bubbles. The void fraction is determined by the ratio between the sphere volume and the control volume. The analysis of the turbulence statistics on the control volume provides numerical results that are interpreted with regard to the effect of the bubble wakes on the turbulence structure in uniformly sheared bubbly flow. We assumed for this purpose that at low void fraction, where there is no hydrodynamic interaction between the bubbles, the single-phase flow simulation over an isolated sphere is representative, on statistical average, of a sphere network. The numerical simulations were first validated against the experimental data of bubbly homogeneous turbulence with constant shear and then extended to produce numerical results for a wide range of shear rates from 0 to 10 s^-1. These results are compared with our turbulence closure proposed for gas-liquid bubbly flows. In this closure, the turbulent stress tensor in the liquid is split into a turbulent dissipative part, produced by the gradient of the mean velocity and which also contains the turbulence generated in the bubble wakes, and a pseudo-turbulent non-dissipative part induced by the bubble displacements. Each part is determined by a specific transport equation. The simulations of uniformly sheared flows over an isolated sphere reproduce the mechanisms related to the turbulent part, and the numerical results are in perfect accordance with the modeling of the transport equation of the turbulent part. The reduction of the second order turbulence closure provides a description of the modification of the turbulence structure by the presence of bubbles, using a dimensionless number expressed in terms of two time scales characterizing the turbulence induced by the shear and that induced by bubble displacements. The numerical simulations carried out in the framework of a comprehensive analysis reproduce, in particular, the attenuation of the turbulent friction shown in the experimental results of bubbly homogeneous turbulence subjected to a constant shear.
Keywords: gas-liquid bubbly flows, homogeneous turbulence, turbulence closure, uniform shear
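Written out, the decomposition of the liquid Reynolds stress described in the abstract can be summarised as below; the notation is assumed here, since the abstract states the closure only in words.

```latex
% Decomposition of the liquid Reynolds stress tensor (notation assumed, not taken from the paper)
\overline{u'_i u'_j}
  \;=\;
  \underbrace{\overline{u'_i u'_j}^{\,T}}_{\substack{\text{turbulent, dissipative part}\\ \text{(mean shear + bubble wakes)}}}
  \;+\;
  \underbrace{\overline{u'_i u'_j}^{\,PT}}_{\substack{\text{pseudo-turbulent, non-dissipative part}\\ \text{(bubble displacements)}}}
```

Each of the two contributions is governed by its own transport equation, as stated in the abstract.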
Procedia PDF Downloads 461
4363 Criminal Law and Internet of Things: Challenges and Threats
Authors: Celina Nowak
Abstract:
The development of information and communication technologies (ICT) and the consequent growth of cyberspace have become a reality of modern societies. The newest addition to this complex structure has been the Internet of Things (IoT), which emerged with the appearance of smart devices. IoT creates a new dimension of the network, as communication is no longer the domain of humans alone but has also become possible between devices themselves. The possibility of communication between devices, devoid of human intervention and real-time supervision, has generated new societal and legal challenges. Some of them may, and certainly eventually will, be connected to criminal law. Legislators at both the national and international level have been struggling to cope with this technologically evolving environment in order to address new threats created by ICT. There are legal instruments on cybercrime, however imperfect and not of universal scope, sometimes referring to specific types of prohibited behaviors undertaken by criminals, such as money laundering or sex offences. However, criminal law seems largely unprepared for the challenges that may arise from the development of IoT. This is largely due to the fact that criminal law, at both the national and international level, is still based on the concept of perpetration of an offence by a human being. This is a traditional approach, historically and factually justified. Over time, some legal systems have developed or accepted the possibility of commission of an offence by a corporation, a legal person. This is in fact a legal fiction, as a legal person cannot commit an offence as such; it needs humans to actually behave in a certain way on its behalf. Yet legislators have come to understand that corporations have their own interests and may benefit from crime, and therefore need to be penalized. This realization, however, has not been welcomed by all states and still gives rise to doubts of an ontological and theoretical nature in many legal systems. For this reason, in many legislations the liability of legal persons for the commission of an offence has not been recognized as criminal responsibility. With technological progress and the growing use of IoT, the discussions referring to criminal responsibility of corporations seem rather inadequate. The world is now facing new challenges and new threats related to ‘smart’ things. They will eventually have to be addressed by legislators if they want, as they should, to keep up with the pace of technological and societal evolution. This will, however, require a reevaluation and possibly a restructuring of the most fundamental notions of modern criminal law, such as perpetration, guilt, and participation in crime. It remains unclear at this point what norms and legal concepts will or may be established. The main goal of the research is to point out the challenges ahead of national and international legislators in the said context and to attempt to formulate some indications as to the directions of change, having in mind the serious threats to privacy and security related to the use of IoT.
Keywords: criminal law, internet of things, privacy, security threats
Procedia PDF Downloads 163
4362 Three-Stage Anaerobic Co-digestion of High-Solids Food Waste and Horse Manure
Authors: Kai-Chee Loh, Jingxin Zhang, Yen-Wah Tong
Abstract:
Hydrolysis and acidogenesis are the rate-controlling steps in an anaerobic digestion (AD) process. Considering that the optimum conditions for each stage can be diverse, the development of a multi-stage AD system is likely to improve AD efficiency through individual optimization. In this research, we developed a highly integrated three-stage anaerobic digester (HM3) to combine the advantages of dry AD and wet AD for anaerobic co-digestion of food waste and horse manure. The digester design comprised three chambers: high-solids hydrolysis, high-solids acidogenesis, and wet methanogenesis. Compared with two control digesters, HM3 presented an 11.2–22.7% higher methane yield. The improved methane yield was mainly attributed to the functionalized partitioning in the integrated digester, which significantly accelerated the solubilization of solid organic matter and the formation of organic acids, as well as ammonia, in the high-solids hydrolytic and acidogenic stages, respectively. Additionally, HM3 showed the highest volatile solids reduction rate among the three digesters. Real-time PCR and pyrosequencing analysis indicated that the abundance and biodiversity of microorganisms, including bacteria and archaea, in HM3 was much higher than in the control reactors.
Keywords: anaerobic digestion, high-solids, food waste and horse manure, microbial community
Procedia PDF Downloads 414
4361 Multi-Granularity Feature Extraction and Optimization for Pathological Speech Intelligibility Evaluation
Authors: Chunying Fang, Haifeng Li, Lin Ma, Mancai Zhang
Abstract:
Speech intelligibility assessment is an important measure to evaluate the functional outcomes of surgical and non-surgical treatment, speech therapy, and rehabilitation. The assessment of pathological speech plays an important role in assisting the experts. Pathological speech is usually non-stationary and mutational; in this paper, we describe a multi-granularity combined feature scheme, which is optimized by a hierarchical visual method. First, pathological features at different granularity levels are extracted: a basic acoustics feature set (BAFS), local spectral characteristics in the form of Mel s-transform cepstrum coefficients (MSCC), and nonlinear dynamic characteristics based on chaotic analysis. Then, radar charts and the F-score are proposed to optimize the features by hierarchical visual fusion. The feature set could be optimized from 526 to 96 dimensions. The experimental results indicate that the new features achieve the best performance with a support vector machine (SVM), with a recognition rate of 84.4% on the NKI-CCRT corpus. The proposed method is thus shown to be effective and reliable for pathological speech intelligibility evaluation.
Keywords: pathological speech, multi-granularity feature, MSCC (Mel s-transform cepstrum coefficients), F-score, radar chart
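The abstract does not reproduce the F-score criterion used for feature ranking; the sketch below applies the commonly used two-class F-score to a hypothetical feature matrix, which is one plausible reading of the dimensionality reduction step from 526 to 96 features. The data and the 0.5/class split are invented for illustration.

```python
import numpy as np

def f_score(x, y):
    """Two-class F-score of one feature column x given binary labels y.
    Larger values indicate better separation between the two classes."""
    pos, neg = x[y == 1], x[y == 0]
    num = (pos.mean() - x.mean()) ** 2 + (neg.mean() - x.mean()) ** 2
    den = pos.var(ddof=1) + neg.var(ddof=1)
    return num / den if den > 0 else 0.0

def rank_features(X, y, keep=96):
    """Rank all feature columns by F-score and keep the top `keep` of them."""
    scores = np.array([f_score(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:keep]

# toy example: a 526-dimensional feature set reduced to 96 dimensions
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 526))
y = rng.integers(0, 2, size=200)
selected = rank_features(X, y, keep=96)
print(len(selected))
```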
Procedia PDF Downloads 283
4360 Towards Reliable Mobile Cloud Computing
Authors: Khaled Darwish, Islam El Madahh, Hoda Mohamed, Hadia El Hennawy
Abstract:
Cloud computing has been one of the fastest growing parts of the IT industry, mainly in the context of the future of the web, where computing, communication, and storage services are the main services provided to Internet users. Mobile Cloud Computing (MCC) is gaining momentum; it can be used to extend cloud computing functions, services, and results to the world of future mobile applications and enables delivery of a large variety of cloud applications to billions of smartphones and wearable devices. This paper addresses reliability for MCC by determining the ability of a system or component to function correctly under stated conditions for a specified period of time, in order to deal with the estimation and management of high levels of lifetime engineering uncertainty and risks of failure. The assessment procedure consists of determining the Mean Time Between Failures (MTBF), Mean Time To Failure (MTTF), and availability percentages for the main components in both cloud computing and MCC structures, applied to a single-node OpenStack installation, to analyze its performance under different settings governing the behavior of participants. Additionally, we present several factors that have a significant impact on overall cloud system reliability and should be taken into account in order to deliver highly available cloud computing services for mobile consumers.
Keywords: cloud computing, mobile cloud computing, reliability, availability, OpenStack
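For reference, the quantities listed in the abstract are commonly computed as in the sketch below; the uptime, failure count, and repair time in the example are invented for illustration and are not the OpenStack measurements reported in the paper.

```python
def mtbf(total_operating_time, failures):
    """Mean Time Between Failures for a repairable component."""
    return total_operating_time / failures

def availability(mean_up, mean_repair):
    """Steady-state availability = uptime / (uptime + downtime)."""
    return mean_up / (mean_up + mean_repair)

# Hypothetical numbers for a single-node installation (not from the paper):
uptime_hours, n_failures, mean_repair_hours = 8760.0, 4, 6.0
component_mtbf = mtbf(uptime_hours, n_failures)          # 2190 h between failures
print(f"MTBF = {component_mtbf:.0f} h, "
      f"availability = {availability(component_mtbf, mean_repair_hours):.4%}")
```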
Procedia PDF Downloads 398
4359 Ecolabelling: Normative Power or Corporate Strategy? A Case Study of a Textile Company in Indonesia
Authors: Suci Lestari Yuana, Shofi Fatihatun Sholihah, Derarika Ensta Jesse
Abstract:
Textile is a buyer-driven industry that relies on label trust from consumers. Most textile manufacturers produce textiles and textile products based on consumer demand. A company's policy is highly dependent on the dynamic evolution of consumer behavior. Recently, eco-friendliness has become one of the most important factors for western consumers when purchasing textiles and textile products (TPT) from a company. In that sense, companies from developing countries are encouraged to follow western consumer values. Some examples of ecolabel certificates are ISO (International Organization for Standardization), Lembaga Ekolabel Indonesia (Indonesian Ecolabel Institution), and the Global Ecolabel Network (GEN). The submission of a national company to an international standard raises a critical question: whether this reflects the legitimation of global norms into national policy, or whether it is actually a practical strategy of the company to gain global consumers. By observing one of the prominent textile companies in Indonesia, this research aims to discuss what impetus factors cause a company to use ecolabels and what the meaning behind them is, whether it comes from normative power or from the strategy of the company. This is qualitative research that selects a company in Sukoharjo, Central Java, Indonesia as a case study in explaining the practice of ecolabelling by a textile company. In-depth interviews were conducted with the company in order to understand the ecolabelling process. In addition, this research also collected documents related to the company's ecolabelling process and its impact on the company's value. The findings of the project reflect several concerns: (1) the role of media as a source of consumer information, (2) the role of government and non-government actors as normative agencies, (3) the role of the company in social responsibility, and (4) eco-friendly consciousness as a value of the company. Environmental norms that have been admitted internationally have changed global industrial processes. These environmental norms have also pushed companies around the world, especially the company in Sukoharjo, Central Java, Indonesia, to follow the norm. Neglecting the global norms would leave the company in an isolated and unsustainable market, which would harm the continuity of the company. So, in a buyer-driven industry, the character of company-consumer relations has brought about a fast, dynamic evolution of norms and values. The creation of global norms and values circulates across national territories and identities.
Keywords: ecolabeling, waste management, CSR, normative power
Procedia PDF Downloads 306
4358 Simulation-Based Investigation of Ferroresonance in Different Transformer Configurations
Authors: George Eduful, Yuanyuan Fan, Ahmed Abu-Siada
Abstract:
Ferroresonance poses a substantial threat to the quality and reliability of power distribution systems due to its inherent characteristics of sustained overvoltages and currents. This paper aims to enhance understanding of, and reduce, the ferroresonance threat by investigating the susceptibility of different transformer configurations using MATLAB/Simulink simulations. To achieve this, four 200 kVA transformers with different vector groups (D-Yn, Yg-Yg, Yn-Yn, and Y-D11) and core types (3-limb, 5-limb, single-phase) were systematically exposed to controlled ferroresonance conditions. The impact of varying the length of the 11 kV cable connected to the transformers was also examined. Through comprehensive voltage, current, and total harmonic distortion analyses, the performance of each configuration was evaluated and compared. The results of the study indicate that transformers with Y-D11 and Yg-Yg configurations exhibited lower susceptibility to ferroresonance in comparison to those with D-Yn and Yn-Yn configurations. This implies that the Y-D11 and Yg-Yg transformers are better suited for applications with a high risk of ferroresonance. The insights provided by this study are of significant value for the strategic selection and deployment of transformers in power systems, particularly in settings prone to ferroresonance. By identifying and recommending transformer configurations that demonstrate better resilience, this paper contributes to enhancing the overall robustness and reliability of power grid infrastructure.
Keywords: cable-connected, core type, ferroresonance, overvoltages, power transformer, vector group
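The abstract reports total harmonic distortion (THD) analyses of the simulated waveforms; a minimal sketch of how THD can be estimated from a sampled signal is given below. The 50 Hz fundamental and the synthetic waveform are assumptions for illustration, not outputs of the Simulink model.

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=20):
    """Total harmonic distortion of `signal` sampled at `fs` Hz
    with fundamental frequency `f0` Hz."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    def mag_at(f):                       # magnitude of the nearest FFT bin
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = mag_at(f0)
    harmonics = [mag_at(k * f0) for k in range(2, n_harmonics + 1)]
    return np.sqrt(sum(h ** 2 for h in harmonics)) / fundamental

# synthetic distorted 50 Hz waveform (illustrative only)
fs = 10_000
t = np.arange(0, 1, 1 / fs)
v = np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 150 * t)
print(f"THD ~ {thd(v, fs, 50):.1%}")   # expect roughly 20% for this test signal
```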
Procedia PDF Downloads 40
4357 Different Data-Driven Bivariate Statistical Approaches to Landslide Susceptibility Mapping (Uzundere, Erzurum, Turkey)
Authors: Azimollah Aleshzadeh, Enver Vural Yavuz
Abstract:
The main goal of this study is to produce landslide susceptibility maps using different data-driven bivariate statistical approaches, namely the entropy weight method (EWM), evidence belief function (EBF), and information content model (ICM), for Uzundere county, Erzurum province, in the north-eastern part of Turkey. Past landslide occurrences were identified and mapped from an interpretation of high-resolution satellite images and earlier reports, as well as by carrying out field surveys. In total, 42 landslide incidence polygons were mapped using ArcGIS 10.4.1 software and randomly split into a construction dataset of 70% (30 landslide incidences) for building the EWM, EBF, and ICM models, while the remaining 30% (12 landslide incidences) were used for verification purposes. Twelve layers of landslide-predisposing parameters were prepared, including total surface radiation, maximum relief, soil groups, standard curvature, distance to stream/river sites, distance to the road network, surface roughness, land use pattern, engineering geological rock group, topographical elevation, orientation of slope, and terrain slope gradient. The relationships between the landslide-predisposing parameters and the landslide inventory map were determined using the different statistical models (EWM, EBF, and ICM). The model results were validated with landslide incidences that were not used during model construction. In addition, receiver operating characteristic curves were applied, and the area under the curve (AUC) was determined for the different susceptibility maps using the success (construction data) and prediction (verification data) rate curves. The results revealed that the AUC values for the success rates are 0.7055, 0.7221, and 0.7368, while those for the prediction rates are 0.6811, 0.6997, and 0.7105 for the EWM, EBF, and ICM models, respectively. Consequently, the landslide susceptibility maps were classified into five susceptibility classes: very low, low, moderate, high, and very high. Additionally, the portion of construction and verification landslide incidences falling in the high and very high landslide susceptibility classes of each map was determined. The results showed that the EWM, EBF, and ICM models produced satisfactory accuracy. The obtained landslide susceptibility maps may be useful for future natural hazard mitigation studies and planning purposes for environmental protection.
Keywords: entropy weight method, evidence belief function, information content model, landslide susceptibility mapping
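The success and prediction rate curves referred to above are ROC-style curves; as a rough illustration of how the reported AUC values can be computed from susceptibility scores and the landslide inventory, a minimal sketch (with made-up scores, not the Uzundere data) is:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Hypothetical susceptibility scores for mapped cells:
# 1 = landslide incidence cell, 0 = non-landslide cell.
labels = np.concatenate([np.ones(30), np.zeros(300)])        # construction set
scores = np.concatenate([rng.normal(0.70, 0.15, 30),          # model output at landslide cells
                         rng.normal(0.45, 0.15, 300)])        # model output elsewhere

success_auc = roc_auc_score(labels, scores)   # "success rate" AUC uses construction landslides
print(f"AUC (success rate) = {success_auc:.4f}")
# The prediction-rate AUC is obtained the same way, using the 12 verification landslides.
```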
Procedia PDF Downloads 132
4356 A Critical Analysis of the Creation of Geoparks in Brazil: Challenges and Possibilities
Authors: Isabella Maria Beil
Abstract:
The International Geoscience and Geoparks Programme (IGGP) was officially created in 2015 by the United Nations Educational, Scientific and Cultural Organization (UNESCO) to enhance the protection of geological heritage and fill the gaps in the World Heritage Convention. According to UNESCO, a Global Geopark is a unified area where sites and landscapes of international geological significance are managed based on a concept of sustainable development. Tourism is seen as a main activity for developing new sources of revenue. Currently (November 2022), UNESCO recognizes 177 Global Geoparks, of which more than 50% are in Europe, 40% in Asia, 6% in Latin America, and the remaining 4% are distributed between Africa and Anglo-Saxon America. This picture shows a very uneven geographical distribution of these areas across the planet. There are currently three Geoparks in Brazil; the first of them was accepted by the Global Geoparks Network in 2006 and, just fifteen years later, two other Brazilian Geoparks also obtained the UNESCO title. Therefore, this paper aims to provide an overview of the current geopark situation in Brazil and to identify the main challenges faced in the implementation of these areas in the country. To this end, Brazil's history and main characteristics regarding the development of geoparks over the years will be briefly presented. Then, the results obtained from interviews with those responsible for each of the current 29 aspiring geoparks in Brazil will be presented. Finally, the main challenges related to the implementation of Geoparks in the country will be listed. Among these challenges, the answers obtained through the interviews revealed conflicts and problems that hinder both the start of a Geopark project and its continuity and implementation. It is clear that the task of getting multiple social actors, or stakeholders, to engage with the Geopark, one of UNESCO's guidelines, is one of its most complex aspects. Therefore, among the main challenges, the difficulty of establishing solid partnerships stands out, which directly reflects divergences between the different social actors and their goals. This difficulty in establishing partnerships happens for a number of reasons. One of them is that the investment in a Geopark project can be high, and investors often expect a short-term financial return. In addition, political support from the public sector is often costly as well, since the possible results and positive influences of a Geopark in a given area will only be experienced during future mandates. These results demonstrate that research on Geoparks goes far beyond the geological perspective linked to their origins and is deeply embedded in political and economic issues.
Keywords: Brazil, geoparks, tourism, UNESCO
Procedia PDF Downloads 90
4355 Formation of Academia-Industry Collaborative Model to Improve the Quality of Teaching-Learning Process
Authors: M. Dakshayini, P. Jayarekha
Abstract:
In the traditional output-based education system, classroom lectures and laboratory work are the traditional delivery methods used during a course. Written examinations and lab examinations have been used as conventional tools for evaluating student performance. Hence, there are certain apprehensions that the traditional education system may not efficiently prepare students for competent professional life. This has led to the change from traditional output-based education to Outcome-Based Education (OBE). OBE first sets the ideal programme learning outcomes, ordered by increasing degree of complexity, that students are expected to master. The core curriculum, teaching methodologies, and assessment tools are then designed to achieve the proposed outcomes, focusing mainly on what students can actually attain after they are taught. In this paper, we discuss a promising application-based learning and evaluation component involving industry collaboration to improve the quality of the teaching and student learning process. Incorporation of this component improves the quality of student learning in engineering education and helps students attain competency as per the graduate attributes. It may also reduce the industry-academia gap.
Keywords: outcome-based education, programme learning outcome, teaching-learning process, evaluation, industry collaboration
Procedia PDF Downloads 449
4354 The Algorithm of Semi-Automatic Thai Spoonerism Words for Bi-Syllable
Authors: Nutthapat Kaewrattanapat, Wannarat Bunchongkien
Abstract:
The purposes of this research are to study and develop an algorithm for Thai spoonerism words in a semi-automatic computer program; that is to say, in the data input stage the syllables are already separated, and in the spoonerism stage the developed algorithm is applied. The algorithm establishes rules and mechanisms for Thai spoonerism of bi-syllable words by analyzing the elements of the syllables, namely the cluster consonant, vowel, intonation mark, and final consonant. The study found that bi-syllable Thai spoonerism has one spoonerism mechanism, namely transposition of the vowel, intonation mark, and final consonant of the two syllables while keeping each syllable's consonant value and cluster (if any). The rules and mechanisms of Thai spoonerism identified in the study were applied to develop Thai spoonerism software using PHP. The software was then subjected to a performance test; it was found that the program performs bi-syllable Thai spoonerism correctly for 99% of all words used in the test, with faults in 1% of cases, as the words obtained from spoonerism may not be spelled in conformity with Thai grammar, and the answer in Thai spoonerism could be more than one answer.
Keywords: algorithm, spoonerism, computational linguistics, Thai spoonerism
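The transposition rule described above (swap the vowel, intonation mark, and final consonant of the two syllables while keeping each syllable's initial/cluster consonant) can be sketched as follows. The syllable representation and the romanised sample word are invented for illustration; the actual software operates on Thai script in PHP, not Python.

```python
def spoonerize(syl1, syl2):
    """Bi-syllable spoonerism: each syllable keeps its own initial (cluster)
    consonant but takes the vowel, intonation mark and final consonant of the other."""
    new1 = {"initial": syl1["initial"], "vowel": syl2["vowel"],
            "tone": syl2["tone"], "final": syl2["final"]}
    new2 = {"initial": syl2["initial"], "vowel": syl1["vowel"],
            "tone": syl1["tone"], "final": syl1["final"]}
    return new1, new2

def render(syl):
    return syl["initial"] + syl["vowel"] + syl["tone"] + syl["final"]

# hypothetical, romanised bi-syllable word (not an actual test item from the paper)
s1 = {"initial": "kr", "vowel": "a", "tone": "", "final": "p"}
s2 = {"initial": "d", "vowel": "ii", "tone": "^", "final": "n"}
print(render(s1), render(s2), "->", *map(render, spoonerize(s1, s2)))
```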
Procedia PDF Downloads 236
4353 Interlingual Interference in Students’ Writing
Authors: Zakaria Khatraoui
Abstract:
Interlanguage has come to occupy a central role across a considerable research landscape. Whether academically driven or pedagogically oriented, interlanguage has become more important than ever before. Academically, it probes theoretical and linguistic issues in the field and, in practice, moves from idea to reality in support of a bridging philosophy between theory and educational practice. Accordingly, the present research offers a well-developed theoretical framework that is sustained by empirical teaching practices, along with teasing apart their narrowly confined implementation. The focus of this interlingual study is placed squarely on syntactic errors observed in students' writing as performance. To this end, the paper qualitatively adopts a set of focal methodological choices supported by a solid design. The central finding to be examined is the creative nature of syntactic errors, marked by the tangible dominance of cognitively intralingual errors over linguistically interlingual ones. Subsequently, this paper attempts to highlight transferable implications worth noting for both theoretical and pedagogically professional principles. In particular, the results are relevant to the scholarly community in a multidimensional sense and recommend actions of educational value.
Keywords: interlanguage, interference, error, writing
Procedia PDF Downloads 74
4352 The Use of Actoprotectors by Professional Athletes
Authors: Kalin Ivanov, Stanislava Ivanova
Abstract:
Actoprotectors are substances with high performance-enhancing potential and high antioxidant activity. Most of these drugs were developed in the USSR for military medicine purposes. Based on their chemical composition, actoprotectors can be classified into three categories: benzimidazole derivatives (ethomersol, bemitil); adamantane derivatives (bromantane); and other chemical classes. The first data on the intake of actoprotectors by professional athletes date from 1980. The daily intake of actoprotectors demonstrates many benefits for athletes, such as a positive effect on the efficiency of physical work, antihypoxic effects, antioxidant effects, nootropic effects, and rapid recovery. Since 1997, bromantane has been considered doping. This is a result of the Summer Olympic Games in Atlanta (1996), when several Russian athletes tested positive for bromantane. Even though the drug is safe for athletes' health, its use is considered a violation of anti-doping rules. For more than 37 years, bemitil has been used by professional athletes with no risk, but currently it is included in the WADA monitoring programme for 2018. The current perspective is that the most used actoprotectors will be considered doping. Many clinical studies have confirmed that intake of bemitil and bromantane demonstrates a positive influence on physical work capacity, but data for other actoprotectors like chlodantane, ademol, and ethomersol are limited.
Keywords: actoprotector, sport, doping, bemitil
Procedia PDF Downloads 322
4351 Numerical Study of Steel Structures Responses to External Explosions
Authors: Mohammad Abdallah
Abstract:
Due to the constant increase in terrorist attacks, the research and engineering communities have given significant attention to building performance under explosions. This paper presents a methodology for studying and simulating the dynamic responses of steel structures during external detonations, particularly for accurately investigating the impact of increasing charge weight on the members' overall behavior, resistance, and failure. A damage prediction method was introduced to evaluate the damage level of the steel members based on five explosion scenarios. The Johnson–Cook strength and failure model was used together with the ABAQUS finite element code to perform the explicit dynamic analysis, and previous field tests were used to verify the acceptance and accuracy of the proposed material strength and failure model. Based on the structural response and evaluation criteria such as deflection, vertical displacement, drift index, and damage level, the obtained results show the vulnerability of steel columns and unbraced steel frames, which are designed and optimized to carry dead and live loads, in resisting and enduring blast loading.
Keywords: steel structure, blast load, terrorist attacks, charge weight, damage level
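For reference, the Johnson–Cook flow stress model cited above is usually written as below; the abstract does not report the material constants used, so A, B, n, C, and m are left symbolic here.

```latex
% Johnson--Cook flow stress (standard form; constants not given in the abstract)
\sigma = \left(A + B\,\varepsilon_p^{\,n}\right)
         \left(1 + C \ln \dot{\varepsilon}^{*}\right)
         \left(1 - T^{*m}\right),
\qquad
\dot{\varepsilon}^{*} = \frac{\dot{\varepsilon}_p}{\dot{\varepsilon}_0},
\qquad
T^{*} = \frac{T - T_{\mathrm{room}}}{T_{\mathrm{melt}} - T_{\mathrm{room}}}
```

Here εp is the equivalent plastic strain, ε̇* the dimensionless plastic strain rate, and T* the homologous temperature; the accompanying Johnson–Cook failure model defines the fracture strain with an analogous set of material constants.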
Procedia PDF Downloads 364
4350 Identification of EEG Attention Level Using Empirical Mode Decompositions for BCI Applications
Authors: Chia-Ju Peng, Shih-Jui Chen
Abstract:
This paper proposes a method to discriminate electroencephalogram (EEG) signals between different concentration states using empirical mode decomposition (EMD). A brain-computer interface (BCI), also called a brain-machine interface, is a direct communication pathway between the brain and an external device that bypasses the usual pathways such as the peripheral nervous system or skeletal muscles. Attention level is a common index used as a control signal in BCI systems. The EEG signals acquired from people paying attention or in relaxation, respectively, are decomposed into a set of intrinsic mode functions (IMFs) by EMD. Fast Fourier transform (FFT) analysis is then applied to each IMF to obtain its frequency spectrum. By observing the power spectra of the IMFs, the proposed method achieves better identification of EEG attention level between different concentration states than the original EEG signals. The band power of IMF3 is the most discriminative, especially in the β band, which corresponds to being fully awake and generally alert. The signal processing method and results of this experiment pave a new way for BCI robotic systems using an attention-level control strategy. The integrated signal processing method reveals appropriate information for discrimination of attention and relaxation, contributing to enhanced BCI performance.
Keywords: biomedical engineering, brain computer interface, electroencephalography, rehabilitation
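A minimal sketch of the processing chain described above (EMD into IMFs, then FFT band power of IMF3 in the β band) is given below; it assumes the third-party PyEMD package for the decomposition and uses a synthetic signal, neither of which comes from the paper.

```python
import numpy as np
from PyEMD import EMD   # assumed third-party package for empirical mode decomposition

fs = 256                                   # assumed EEG sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.randn(t.size)   # synthetic "EEG"

imfs = EMD().emd(eeg)                      # intrinsic mode functions, IMF1..IMFn

def band_power(x, fs, lo, hi):
    """Power of x in the [lo, hi] Hz band from its FFT spectrum."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum()

if imfs.shape[0] >= 3:
    beta_power = band_power(imfs[2], fs, 13, 30)   # IMF3, β band (13-30 Hz)
    print(f"IMF3 beta-band power: {beta_power:.2f}")
```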
Procedia PDF Downloads 391
4349 Performance of Heifer Camels (Camelus dromedarius) on Native Range Supplemented with Different Energy Levels
Authors: Shehu, B., Muhammad, B. F., Madigawa, I. L., Alkali, H. A.
Abstract:
The study was conducted to assess heifer camel behavior and live weight changes on native range supplemented with different energy levels. A total of nine camels aged between 2 and 3 years were randomly allotted into three groups supplemented with 3400, 3600 and 3800 kcal, designated A, B and C, respectively. The data obtained were analyzed for variance in a Completely Randomized Design. The heifers spent an average of 371.70 min/day (64% of daylight time) browsing on native pasture and 2.30 min/day (6%) sand bathing. A significantly longer mean time was spent by heifers browsing Leptadenia hastata (P<0.001), Dichrostachys cinerea (P<0.01), Acacia nilotica (P<0.001) and Ziziphus spina-christi (P<0.05) in the early dry season (January). No significant difference was recorded in browsing time on Tamarindus indica, Adansonia digitata, Piliostigma reticulatum, Parkia biglobosa and Azadirachta indica. No significant (P<0.05) liveweight change was recorded in the heifers due to the three energy levels. It was concluded that nutritive browse species in the study area could meet camel nutrient requirements, including energy. Further research on the effect of period on camel nutrient requirements in different physiological conditions is recommended.
Keywords: heifer, camel, grazing, pasture
Procedia PDF Downloads 543
4348 Effect of Masonry Infill in R.C. Framed Buildings
Authors: Pallab Das, Nabam Zomleen
Abstract:
Effective dissipation of the lateral loads arising from seismic forces determines the strength, durability and safety of a structure. Masonry infill has high stiffness and strength, which can be effectively utilized for lateral load dissipation by incorporating it into building construction; however, masonry behaves in a highly nonlinear manner, so it is important to find a generalized yet rational approach to determine its nonlinear behavior, its failure mode, and its response when it is incorporated into a building. Most countries do not specify a procedure for the design of masonry infill walls, whereas many analytical modeling methods are available in the literature, e.g., the equivalent diagonal strut method, finite element modeling, etc. In this paper, the masonry infill is modeled, and a 6-storey bare framed building and the same building with masonry infill are analyzed using SAP2000 v14 in order to obtain the inter-storey drift by time-history analysis and the capacity curve by pushover analysis. The analysis shows that, while the structure is well within the CP performance level for both cases, there is a considerable reduction of inter-storey drift of about 28% when the building is analyzed with masonry infill walls.
Keywords: capacity curve, masonry infill, nonlinear analysis, time history analysis
Procedia PDF Downloads 383
4347 The Influence of Wasta on Employees and Organizations in Kuwait
Authors: Abrar Al-Enzi
Abstract:
This study investigates the role of the widespread utilization of Wasta within Arab societies. Wasta, by definition, is a set of personal networks based on family or kinship ties in which power and influence are utilized to get things done. As Wasta evolved, it became deeply rooted in Arab cultures, where it is considered an intrinsic tool of the culture, a method of doing business transactions, and a family obligation. However, the consequences related to Wasta in business are substantial, as it impacts organizational performance, employees' perception of the organization, and the atmosphere between employees. To date, there has been little in-depth organizational research on the impact of Wasta. Hence, the question that will be addressed is: does Wasta influence human resource management, knowledge sharing, and innovation in Kuwait, which in turn affects employees' commitment within organizations? A mixed-method sequential exploratory research design will be used to examine this subject, consisting of three phases: (1) conducting initial exploratory interviews; (2) developing a paper-based and online survey (quantitative method) based on the findings; and (3) following up with semi-structured interviews (qualitative method). The rationale behind this approach is that the qualitative and quantitative methods complement each other by providing a more complete picture of the subject matter.
Keywords: commitment, HRM practices, social capital, Wasta
Procedia PDF Downloads 262
4346 Role of Facade in Sustainability Enhancement of Contemporary Iranian Buildings
Authors: H. Nejadriahi
Abstract:
A growing demand for sustainability makes it one of the significant debates of our time. Energy saving is one of the main criteria to be considered in the context of sustainability. Reducing energy use in buildings is one of the most important ways to reduce humans' overall environmental impact. Taking this into consideration, the study of different design strategies that can assist in reducing energy use, and subsequently improve the sustainability level of today's buildings, is an essential task. The sustainability level of a building is highly affected by the sustainability performance of its components. One of the main building components, which can have a great impact on energy saving and the sustainability level of the building, is its facade. The aim of this study is to investigate the role of the facade in the sustainability enhancement of contemporary buildings in Iran. In this study, the concept of sustainability in architecture, building facades, and their relationship to sustainability are explained briefly. Following that, a number of contemporary Iranian buildings are discussed and analyzed in terms of the different design strategies used in their facades in accordance with sustainability concepts. The methods used in this study are descriptive and analytic. The results of this paper would assist in generating a wider vision and a source of inspiration for current designers to design and create environmentally friendly and sustainable buildings for the future.
Keywords: building facade, contemporary buildings, Iran, sustainability
Procedia PDF Downloads 333
4345 Progressive Type-I Interval Censoring with Binomial Removal-Estimation and Its Properties
Authors: Sonal Budhiraja, Biswabrata Pradhan
Abstract:
This work considers statistical inference based on progressive Type-I interval censored data with random removal. The scheme of progressive Type-I interval censoring with random removal can be described as follows. Suppose n identical items are placed on a test at time T0 = 0 with k pre-fixed inspection times T1 < T2 < . . . < Tk, where Tk is the scheduled termination time of the experiment. At inspection time Ti, Ri of the Si remaining surviving units are randomly removed from the experiment. The removal follows a binomial distribution with parameters Si and pi for i = 1, . . . , k, with pk = 1. In this censoring scheme, the number of failures in each inspection interval and the number of randomly removed items at the pre-specified inspection times are observed. Asymptotic properties of the maximum likelihood estimators (MLEs) are established under some regularity conditions. A β-content, γ-level tolerance interval (TI) is determined for the two-parameter Weibull lifetime model using the asymptotic properties of the MLEs. The minimum sample size required to achieve the desired β-content, γ-level TI is determined. The performance of the MLEs and the TI is studied via simulation.
Keywords: asymptotic normality, consistency, regularity conditions, simulation study, tolerance interval
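The censoring scheme described above can be simulated directly; the sketch below generates one realisation of progressive Type-I interval censored Weibull data with binomial removals. The values of n, the Weibull parameters, the inspection times, and the removal probabilities are made up for illustration; the paper's simulation settings are not reproduced in the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_scheme(n=100, shape=1.5, scale=10.0,
                    inspection_times=(2.0, 4.0, 6.0, 8.0),
                    removal_probs=(0.2, 0.2, 0.2, 1.0)):
    """One realisation of progressive Type-I interval censoring with binomial removal.
    Returns (failures per interval, removals per inspection time)."""
    lifetimes = scale * rng.weibull(shape, size=n)   # two-parameter Weibull lifetimes
    at_risk = np.sort(lifetimes)
    prev_t, d, r = 0.0, [], []
    for t_i, p_i in zip(inspection_times, removal_probs):
        failed = int(np.sum((at_risk > prev_t) & (at_risk <= t_i)))  # failures in (prev_t, t_i]
        survivors = rng.permutation(at_risk[at_risk > t_i])          # shuffle so removal is random
        removed = rng.binomial(len(survivors), p_i)                  # binomial removal R_i
        at_risk = survivors[removed:]                                # drop the removed units
        d.append(failed)
        r.append(int(removed))
        prev_t = t_i
    return d, r

print(simulate_scheme())
```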
Procedia PDF Downloads 250
4344 Automatic Moderation of Toxic Comments in the Face of Local Language Complexity in Senegal
Authors: Edouard Ngor Sarr, Abel Diatta, Serigne Mor Toure, Ousmane Sall, Lamine Faty
Abstract:
Thanks to Web 2.0, we are witnessing a form of democratization of speech and an exponential increase in the number of users on the web, but also, and above all, the accumulation of a daily flow of content that is becoming, at times, uncontrollable. Added to this is the rise of a violent social fabric characterised by hateful and racist comments, insults, and other content that contravenes social rules and the platforms' terms of use. Consequently, managing and regulating this mass of new content is proving increasingly difficult, requiring substantial human, technical, and technological resources. Without regulation, and with the complicity of anonymity, this toxic content can pollute discussions and make these online spaces highly conducive to abuse, which very often has serious consequences for certain internet users, ranging from anxiety to suicide, depression, or withdrawal. The toxicity of a comment is defined as anything that is rude, disrespectful, or likely to cause someone to leave a discussion or to take violent action against a person or a community. Two levels of measures are needed to deal with this deleterious situation. The first measures are being taken by governments through draft laws with a dual objective: (i) to punish the perpetrators of these abuses and (ii) to make online platforms accountable for the mistakes made by their users. The second measure comes from the platforms themselves. By assessing the content left by users, they can set up filters to block and/or delete content or decide to suspend the user in question for good. However, the speed of discussions and the volume of data involved mean that platforms are unable to properly monitor the moderation of content produced by Internet users. That is why they use human moderators, either through recruitment or outsourcing. Moderating comments on the web means assessing and monitoring users' comments on online platforms in order to strike the right balance between protection against abuse and users' freedom of expression. It makes it possible to determine which publications and users are allowed to remain online and which are deleted or suspended, how authorised publications are displayed, and what actions accompany content deletions. In this study, we look at the problem of automatic moderation of toxic comments in the face of local African languages and, more specifically, social network comments in Senegal. We review the state of the art, highlighting the different approaches, algorithms, and tools for moderating comments. We also study the issues and challenges of moderation in web ecosystems with lesser-known languages, such as local languages.
Keywords: moderation, local languages, Senegal, toxic comments
Procedia PDF Downloads 2
4343 Tuning for a Small Engine with a Supercharger
Authors: Shinji Kajiwara, Tadamasa Fukuoka
Abstract:
The formula project of Kinki University has been involved in the student Formula SAE of Japan (JSAE) since the second year the competition was held. The vehicle developed in the project uses a ZX-6R engine, manufactured by Kawasaki Heavy Industries, and has now been entered in the JSAE competition for the eighth time. The limited performance of the concept vehicle was improved through the development of the power train. The supercharger loading, engine dry sump, and engine cooling management of the vehicle were also enhanced. The supercharger loading enabled the vehicle to achieve a maximum output of 59.6 kW (80.6 PS) at 9000 rpm and a maximum torque of 70.6 Nm (7.2 kgf m) at 8000 rpm. We successfully achieved 90% of the engine's torque band (4000–10000 rpm) with 50% of the revolutions in regular engine use (2000–12000 rpm). Using a dry sump system, we periodically managed hydraulic pressure during engine operation. A system that shuts down the engine when hydraulic pressure falls was also constructed. Using the dry sump system at 80 mm reduced the required engine load and lowered the vehicle's center of gravity. Even when engine motion was suspended, the electromotive force exerted by the water pump kept the cooling water circulating. These findings enabled us to create a cooling system in accordance with the requirements of the competition.
Keywords: engine, combustion, cooling system, numerical simulation, power, torque, mechanical supercharger
Procedia PDF Downloads 300
4342 A Comparative Study between Japan and the European Union on Software Vulnerability Public Policies
Authors: Stefano Fantin
Abstract:
The present analysis is an outcome of the research undertaken in the course of the European-funded project EUNITY, which targets the gaps in research and development on cybersecurity and privacy between Europe and Japan. Under these auspices, the research presents a study of the policy approach of Japan, the EU, and a number of Member States of the Union with regard to the handling and discovery of software vulnerabilities, with the aim of identifying methodological differences and similarities. This research builds upon a functional comparative analysis of both public policies and legal instruments from the identified jurisdictions. The result of this analysis is based on semi-structured interviews with EUNITY partners, as well as on the researcher's participation in a recent report from the Center for EU Policy Study on software vulnerability. The European Union presents a rather fragmented legal framework on software vulnerabilities. The presence of a number of different pieces of legislation at the EU level (including the Network and Information Security Directive, the Critical Infrastructure Directive, the Directive on Attacks against Information Systems, and the proposal for a Cybersecurity Act), with no clear focus on this subject, makes it difficult for both national governments and end-users (software owners, researchers, and private citizens) to gain a clear understanding of the Union's approach. Additionally, the current data protection reform package (the General Data Protection Regulation) seems to create legal uncertainty around security research. To date, at the Member State level, a few efforts towards transparent practices have been made, namely by the Netherlands, France, and Latvia. This research will explain what policy approach these countries have taken. Japan started implementing a coordinated vulnerability disclosure policy in 2004. To date, two amendments to the framework can be registered (2014 and 2017). The framework is furthermore complemented by a series of instruments allowing researchers to responsibly disclose any new discovery. However, the policy has started to lose its efficiency due to a significant increase in reports made to the authority in charge. To conclude, the research conducted reveals two asymmetric policy approaches, time-wise and content-wise. The analysis will therefore conclude with a series of policy recommendations based on the lessons learned from both regions, towards a common approach to the security of European and Japanese markets, industries, and citizens.
Keywords: cybersecurity, vulnerability, European Union, Japan
Procedia PDF Downloads 156
4341 A Study on the False Alarm Rates of MEWMA and MCUSUM Control Charts When the Parameters Are Estimated
Authors: Umar Farouk Abbas, Danjuma Mustapha, Hamisu Idi
Abstract:
It is now a known fact that quality is an important issue in manufacturing industries. A control chart is an integrated and powerful tool in statistical process control (SPC). In practice, the mean µ and standard deviation σ parameters are estimated from data. In general, the multivariate exponentially weighted moving average (MEWMA) and multivariate cumulative sum (MCUSUM) charts are used for the detection of small shifts in the joint monitoring of several correlated variables; the charts use information from past data, which makes them sensitive to small shifts. The aim of the paper is to compare the performance of the Shewhart x-bar, MEWMA, and MCUSUM control charts in terms of their false alarm rates when parameters are estimated under autocorrelation. A simulation was conducted in the R software to generate the average run length (ARL) values of each of the charts. The results show that the MEWMA chart has lower false alarm rates than the MCUSUM chart at various levels of parameter estimation relative to the in-control ARL0 values. It was also noticed that the sample size has an adverse effect on the false alarm rates of the control charts.
Keywords: average run length, MCUSUM chart, MEWMA chart, false alarm rate, parameter estimation, simulation
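The paper's simulation was done in R; a minimal Python sketch of the same idea for the MEWMA chart, estimating the in-control ARL (and hence the false alarm behavior) when the mean vector and covariance are estimated from a Phase I sample, could look like the following. The smoothing constant, control limit, and sample sizes are placeholders, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(7)
p, lam, h = 2, 0.1, 8.63          # dimension, smoothing constant, control limit (placeholder)

def run_length(mu_hat, sigma_hat, max_n=5000):
    """Run length of a MEWMA chart built from estimated parameters,
    monitoring an in-control N(0, I) process."""
    sigma_z = (lam / (2 - lam)) * sigma_hat        # asymptotic covariance of Z_i
    inv = np.linalg.inv(sigma_z)
    z = np.zeros(p)
    for i in range(1, max_n + 1):
        x = rng.standard_normal(p)                 # in-control observation
        z = lam * (x - mu_hat) + (1 - lam) * z     # MEWMA recursion
        if z @ inv @ z > h:                        # T^2 statistic exceeds the limit
            return i                               # (false) alarm at sample i
    return max_n

def arl0(m_phase1=50, reps=500):
    """In-control ARL when mu and Sigma are estimated from m_phase1 Phase I samples."""
    rls = []
    for _ in range(reps):
        phase1 = rng.standard_normal((m_phase1, p))
        rls.append(run_length(phase1.mean(axis=0), np.cov(phase1, rowvar=False)))
    return np.mean(rls)

print(f"estimated in-control ARL: {arl0():.1f}")
```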
Procedia PDF Downloads 222
4340 Assessment of Human Factors Analysis and Classification System in Construction Accident Prevention
Authors: Zakari Mustapha, Clinton Aigbavboa, Wellington Didi Thwala
Abstract:
The majority of incidents and accidents in the complex high-risk systems that exist in the construction industry and other sectors have been attributed to the unsafe acts of workers. The purpose of this paper was to assess the Human Factors Analysis and Classification System (HFACS) in construction accident prevention. The study was conducted through the use of secondary data from journals, books, and the internet to achieve the objective of the study. The review of the literature looked in detail at different views from different scholars about the HFACS framework in accident investigations. It further highlighted various sections or disciplines of accident occurrence in human performance within construction. The findings from the literature review showed that the unsafe acts of workers and unsafe working conditions are the two major causes of accidents in the construction industry. The most significant factor in the cause of site accidents in the construction industry is the unsafe acts of workers. The findings also show how the application of the HFACS framework in the investigation of accidents will lead to the identification of common trends. Further findings show that provision for the prevention of accidents will be made based on past accident records to identify and prioritize where intervention is needed within the construction industry.
Keywords: accident, construction, HFACS, unsafe acts
Procedia PDF Downloads 321
4339 Introducing and Effectiveness Evaluation of Innovative Logistics System Simulation Teaching: Theoretical Integration and Verification
Authors: Tsai-Pei Liu, Zhi-Rou Zheng, Tzu-Tzu Wen
Abstract:
Innovative logistics system simulation teaching extracts the characteristics of the system through simulation methodology. The system exhibits randomness and interaction problems at execution time. Therefore, the simulation model can usually deal with more complex logistics process problems, giving students different learning modes. Students have more autonomy in learning time and learning progress. System simulation has become a new educational tool, but it still needs to pass many tests before it can be used widely in the teaching field. Although many business management departments in Taiwan have started to promote it, this kind of simulation system teaching is still not popular, and the prerequisite for popularization is that it be supported by students. This research uses an extension of the Unified Theory of Acceptance and Use of Technology (UTAUT2) to explore the acceptance among students of universities of science and technology of using system simulation as a learning tool. At the same time, it is hoped that this innovation can explore the effectiveness of logistics system simulation after its introduction into teaching. The results indicated the significant influence of performance expectancy, social influence, and learning value on students' behavioral intention, and confirmed the influence of facilitating conditions on behavioral intention. The extended UTAUT2 framework helps in understanding students' perceived value in the innovative logistics system teaching context.
Keywords: UTAUT2, logistics system simulation, learning value, Taiwan
Procedia PDF Downloads 115
4338 Improving Decision Support for Organ Transplant
Authors: Ian McCulloh, Andrew Placona, Darren Stewart, Daniel Gause, Kevin Kiernan, Morgan Stuart, Christopher Zinner, Laura Cartwright
Abstract:
An estimated 22–25% of viable deceased donor kidneys are discarded every year in the US, while waitlisted candidates are dying every day. As many as 85% of transplanted organs are refused at least once for a patient who scored higher on the match list. There are hundreds of clinical variables involved in making a clinical transplant decision, and there is rarely an ideal match. Decision makers exhibit an optimism bias, whereby they may refuse an organ offer assuming a better match is imminent. We propose a semi-parametric Cox proportional hazards model, augmented by an accelerated failure time model based on patient-specific suitable organ supply and demand, to estimate a time-to-next-offer. Performance is assessed with Cox-Snell residuals and decision curve analysis, demonstrating improved decision support for up to a 5-year outlook. Providing clinical decision makers with quantitative evidence of likely patient outcomes (e.g., time to next offer and the mortality associated with waiting) may improve decisions and reduce optimism bias, thus reducing discarded organs and matching more patients on the waitlist.
Keywords: decision science, KDPI, optimism bias, organ transplant
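As a sketch of the modeling approach, fitting a Cox proportional hazards model to waiting times until the next offer could look like the following; the column names, the synthetic data, and the use of the lifelines package are assumptions for illustration and do not reproduce the authors' dataset, covariates, or the accelerated failure time augmentation.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter   # assumed survival-analysis package

rng = np.random.default_rng(3)
n = 200
# Hypothetical candidate-level covariates (not the authors' variables):
kdpi = rng.uniform(20, 95, n)                 # quality index of the refused offer
blood_type_O = rng.integers(0, 2, n)
# Synthetic waiting time until the next suitable offer, loosely tied to the covariates.
baseline = rng.exponential(180, n)
days = baseline * (1 + 0.01 * (kdpi - 50)) * (1 + 0.3 * blood_type_O)
observed = rng.random(n) < 0.8                # ~20% administratively censored

df = pd.DataFrame({"days_to_next_offer": days,
                   "offer_received": observed.astype(int),
                   "kdpi_of_refused": kdpi,
                   "blood_type_O": blood_type_O})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_next_offer", event_col="offer_received")
cph.print_summary()                       # hazard ratios for each covariate
print(cph.predict_median(df).head())      # per-candidate time-to-next-offer estimate
```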
Procedia PDF Downloads 105
4337 Preparation and Characterization of Nanostructured FeN Electrocatalyst for Air Cathode Microbial Fuel Cell (MFC)
Authors: Md. Maksudur Rahman Khan, Chee Wai Woon, Huei Ruey Ong, Vignes Rasiah, Chin Kui Cheng, Kar Min Chan, E. Baranitharan
Abstract:
The present work presents the preparation of a non-precious iron-based electrocatalyst (FeN) for the oxygen reduction reaction (ORR) in an air-cathode microbial fuel cell by pyrolysis treatment. Iron oxalate recovered from industrial wastewater and phenanthroline (Phen) were used as the iron and nitrogen precursors, respectively, in preparing the FeN catalyst. The performance of the as-prepared catalyst (FeN) was investigated in a single-chambered air-cathode MFC in which anaerobic sludge was used as the inoculum and palm oil mill effluent as the substrate. The maximum open circuit voltage (OCV) and the highest power density recorded were 0.543 V and 4.9 mW/m2, respectively. Physical characterization of FeN was carried out using Brunauer–Emmett–Teller (BET) analysis, X-ray diffraction (XRD) analysis, and field emission scanning electron microscopy (FESEM), while the electrochemical properties were characterized by cyclic voltammetry (CV) analysis. The presence of biofilm on the anode surface was examined using FESEM and confirmed using infrared spectroscopy and thermogravimetric analysis. The findings of this study demonstrated that FeN is electrochemically active, and further modification is needed to increase its ORR catalytic activity.
Keywords: iron based catalyst, microbial fuel cells, oxygen reduction reaction, palm oil mill effluent
Procedia PDF Downloads 334
4336 An Optimal Matching Design Method of Space-Based Optical Payload for Typical Aerial Target Detection
Authors: Yin Zhang, Kai Qiao, Xiyang Zhi, Jinnan Gong, Jianming Hu
Abstract:
In order to effectively detect aerial targets over long distances, an optimal matching design method for a space-based optical payload is proposed. First, the main factors affecting the optical detectability of small targets in complex environments are analyzed based on the full link of a detection system, including band center, bandwidth, and spatial resolution. Then, a performance characterization model representing the relationship between the image signal-to-clutter ratio (SCR) and the above influencing factors is established to describe a detection system. Finally, an optimal matching design example is demonstrated for a typical aerial target by simulating and analyzing its SCR under different scene clutter with multi-scale characteristics, and the optimized detection band and spatial resolution are presented. The method can provide a theoretical basis and scientific guidance for space-based detection system design, payload specification demonstration, and information processing algorithm optimization.
Keywords: space-based detection, aerial targets, optical system design, detectability characterization
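The abstract does not give the explicit form of the characterization model; one common way to quantify the detectability it refers to is the signal-to-clutter ratio of the target against its local background, e.g.:

```latex
% A common SCR definition (assumed here; the paper's exact model is not given in the abstract)
\mathrm{SCR} = \frac{\left|\mu_t - \mu_b\right|}{\sigma_b}
```

where μt is the mean target intensity and μb, σb are the mean and standard deviation of the surrounding clutter; a model of this kind can then be related to band center, bandwidth, and spatial resolution as described above.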
Procedia PDF Downloads 168