Search results for: software reliability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6417

5067 Investigation of the Impact of Family Status and Blood Group on Individuals’ Addiction

Authors: Masoud Abbasalipour

Abstract:

In this study, the impact of family status on individuals' addiction was investigated, involving factors such as parents' literacy level, family size, and individuals' blood group. Statistical tests were employed to scrutinize the relationships among these factors. The statistical population consisted of 338 samples divided into two groups: individuals with addiction and those without addiction in the city of Amol. The addicted group was selected from individuals visiting the substance abuse treatment center in Amol, and the non-addicted group was randomly selected from individuals in urban and rural areas. The chi-square test was used to examine the presence or absence of relationships among the variables, and Cramér's V test was employed to determine the strength of those relationships. Excel facilitated the initial entry of data, and SPSS was used for the statistical tests. The results indicated a significant relationship between parents' education level and individuals' addiction: the parents of addicted individuals had significantly lower education levels than those of non-addicted individuals. However, the number of family members and blood group did not significantly affect individuals' susceptibility to addiction.

Keywords: addiction, blood group, parents' literacy level, family status

Procedia PDF Downloads 58
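As a concrete illustration of the analysis pipeline this abstract describes (a chi-square test of independence followed by Cramér's V for effect size), the sketch below runs both on a hypothetical 2×2 contingency table. The counts are invented for illustration and are not the study's data; only the sample size (338) matches the abstract.

```python
# Hypothetical contingency table -- rows: parents' education (low / high);
# columns: addicted / non-addicted. Not the study's actual counts.
import math

table = [[90, 50],
         [60, 138]]

n = sum(sum(row) for row in table)
row_tot = [sum(row) for row in table]
col_tot = [sum(col) for col in zip(*table)]

# Pearson chi-square statistic: sum over cells of (O - E)^2 / E,
# where E is the expected count under independence.
chi2 = sum((table[i][j] - row_tot[i] * col_tot[j] / n) ** 2
           / (row_tot[i] * col_tot[j] / n)
           for i in range(2) for j in range(2))

# Cramér's V effect size, in [0, 1]; for a 2x2 table the smaller
# dimension minus one is 1.
cramers_v = math.sqrt(chi2 / (n * (2 - 1)))

# With df = 1, chi2 far above the 3.841 critical value is significant at 0.05.
print(round(chi2, 2), round(cramers_v, 3))
```

With these made-up counts the statistic is well above the 5% critical value, which is the shape of result the abstract reports for parents' education level.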
5066 THRAP2 Gene Identified as a Candidate Susceptibility Gene of Thyroid Autoimmune Diseases Pedigree in Tunisian Population

Authors: Ghazi Chabchoub, Mouna Feki, Mohamed Abid, Hammadi Ayadi

Abstract:

Autoimmune thyroid diseases (AITDs), including Graves' disease (GD) and Hashimoto's thyroiditis (HT), are inherited as complex traits. Genetic factors associated with AITDs have been tentatively identified by candidate gene and genome scanning approaches. We analysed three intragenic microsatellite markers in the thyroid hormone receptor associated protein 2 gene (THRAP2), mapped near the D12S79 marker, which has a potential role in immune function and inflammation [THRAP2-1 (TG)n, THRAP2-2 (AC)n and THRAP2-3 (AC)n]. Our study population comprised 12 patients affected with AITDs belonging to a multiplex Tunisian family with a high prevalence of AITDs. Fluorescent genotyping was carried out on ABI 3100 sequencers (Applied Biosystems, USA) with GENESCAN for semi-automated fragment sizing and GENOTYPER peak-calling software. Statistical analysis was performed using the non-parametric LOD score (NPL) computed by the Merlin software, which outputs non-parametric NPLall (Z) and LOD scores and their corresponding asymptotic P values. The analysis of the three intragenic markers in the THRAP2 gene revealed strong evidence for linkage (NPL=3.68, P=0.00012). Our results suggest a possible role of the THRAP2 gene in AITDs susceptibility in this family.

Keywords: autoimmunity, autoimmune disease, genetic, linkage analysis

Procedia PDF Downloads 109
5065 Review of Currently Adopted Intelligent Programming Tutors

Authors: Rita Garcia

Abstract:

Intelligent Programming Tutors (IPTs) are supplemental educational tools that assist in teaching software development. These systems provide customized learning, allowing the user to select the presentation pace and pedagogical strategy and to recall previous and additional teaching materials that reinforce learning objectives. In addition, an IPT automatically records each individual's progress, providing feedback to the instructor and student. These systems have an advantage over conventional tutoring systems because they are not limited to one teaching strategy and can adjust when they detect the user struggling with a concept. The Intelligent Programming Tutor is a category of Intelligent Tutoring Systems (ITS). ITS are available for many fields in education, support different learning objectives, and integrate into other learning tools, improving the student's learning experience. This study provides a comparison of the IPTs currently adopted by the educational community, focusing on their teaching methodologies and programming languages. The study also considers the ability to integrate an IPT into other educational technologies, such as massive open online courses (MOOCs). The intention of this evaluation is to determine the one system that would best serve a larger ongoing research project and to provide findings for other institutions looking to adopt an Intelligent Programming Tutor.

Keywords: computer education tools, integrated software development assistance, intelligent programming tutors, tutoring systems

Procedia PDF Downloads 308
5064 Investigating the Effective Physical Factors in the Development of Coastal Ecotourism in Southern Islands of Iran: A Case Study of Hendurabi Island, Iran

Authors: Zahra Khodaee

Abstract:

Background and Objective: The southern islands of Iran with tourism potential, Kish and Qeshm and recently Hendurabi, are becoming more and more popular and attract increasing attention from investors. The Iranian coral reef islands, with the exception of Kish and Qeshm, have not undergone sufficient development. These islands face two pressures: climate change and growing tourist demand. Lack of proper planning, inefficient management, and inadequate knowledge of offshore ecosystems have severely damaged this world natural heritage. Because ecotourism on coral islands needs further study, this research examined the interrelation of tourism, development, and ecosystem. Method: This qualitative study used library research, field studies, and surveys to examine the objective and subjective physical factors of ecotourism development on Hendurabi Island. The survey was conducted with 150 randomly chosen tourists on Kish Island who expressed a desire to travel to Hendurabi Island. Data were analysed in SPSS with a descriptive-analytical method and t-tests, and the questionnaire's validity was assessed with AMOS to ensure the questions were sufficiently relevant. Findings: The results indicated that the physical factors affecting ecotourism development fall into two categories, objective and subjective, with acceptable model fit in the target community (IFI = 0.911, CFI = 0.907).
Discussion and conclusion: The results showed that eco-tourists attached importance to sea views, quiet secluded areas, tranquility, security, the quality of the area being visited, and easy access to services; these were the top criteria for visitors, provided environmental compliance is maintained. Management of these regions should ensure appropriate utilization along with sustainable and ecological responsibility.

Keywords: ecotourism, coral reef island, development management, Hendurabi Island

Procedia PDF Downloads 130
5063 Analysis of a CO₂ Two-Phase Ejector Performances with Taguchi and Anova Optimization

Authors: Karima Megdouli

Abstract:

The ejector, a central element of the CO₂ transcritical ejector refrigeration system, plays a significant role in enhancing refrigeration capacity and minimizing compressor power usage. This study's objective is to introduce a technique for enhancing the effectiveness of the CO₂ transcritical two-phase ejector using Taguchi and ANOVA analysis. The investigation examines the impact of geometric parameters, secondary flow temperature, and primary flow pressure on the efficiency of the ejector. Results indicate that combining Taguchi and ANOVA offers increased reliability and superior performance when optimizing the design of the CO₂ two-phase ejector.

Keywords: ejector, supersonic, Taguchi, ANOVA, optimization

Procedia PDF Downloads 63
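The Taguchi/ANOVA workflow the abstract names can be sketched in a few lines: assign factors to an orthogonal array, convert each run's response to a signal-to-noise (S/N) ratio, then apportion the S/N variation to factors. The design, factor names, and entrainment-ratio responses below are all hypothetical stand-ins, not the paper's data.

```python
# Minimal Taguchi S/N + ANOVA-style sketch for a hypothetical L4 array with
# two 2-level factors (say, mixing-chamber diameter and primary pressure).
import numpy as np

design = np.array([[0, 0],    # factor levels per run (0/1)
                   [0, 1],
                   [1, 0],
                   [1, 1]])
response = np.array([0.42, 0.48, 0.55, 0.61])  # entrainment ratio, larger-is-better

# Larger-the-better S/N ratio: -10*log10(mean(1/y^2)); one response per run.
sn = -10 * np.log10(1.0 / response**2)

# Main effect of each factor: mean S/N at level 1 minus mean S/N at level 0.
effects = [sn[design[:, f] == 1].mean() - sn[design[:, f] == 0].mean()
           for f in range(design.shape[1])]

# ANOVA-style sum of squares per factor (2 runs per level in an L4).
ss = [2 * ((sn[design[:, f] == 0].mean() - sn.mean())**2 +
           (sn[design[:, f] == 1].mean() - sn.mean())**2)
      for f in range(design.shape[1])]

print(effects, ss)
```

The factor with the larger effect and sum of squares would be ranked as the more influential design variable, which is the kind of ranking ANOVA provides in the study.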
5062 Health Status, Perception of Self-Efficacy and Social Support of Thailand Aging

Authors: Wipakon Sonsnam, Kanya Napapongsa

Abstract:

This quantitative study aimed to examine: 1) the health status of the aging; 2) their perceived self-efficacy in self-care; 3) their perceived social support; and 4) factors associated with self-efficacy in enhancing health and in self-care during illness. One hundred samples were selected by random sampling from communities in Dusit, Bangkok, in 2014. Data were collected with 5-point rating-scale questionnaires (strongly agree, agree, undecided, disagree, strongly disagree) whose content validity was approved by three experts; reliability coefficients (alpha) were .784 for perceived self-efficacy in self-care and .827 for perceived social support. The ST-5 and 2Q instruments were used to assess mental health, and the ability to engage in a daily routine was measured with the Barthel ADL index. The sample group was 68% female, and 33% were aged 60-65. Most were married and still living with their spouse (55%) and not working (38%). The average annual income was less than 10,000 baht, supplemented by children; most considered this income adequate (49.0%) and satisfactory (61.0%). Most of the aging cared for themselves, followed by care from a spouse (26%). All received public welfare support for living (100%), and 23% participated with health volunteers in their communities. In terms of health, 53% felt their health was fair; hypertension was the most common condition (68%), followed by diabetes (55%). Regarding eyesight, 42% had adequate visual acuity, and 59.0% did not need hearing aids. 84% had more than 20 remaining teeth, and 61% had no problem with chewing. In daily routines, most (84%) were in type 1, and 91% did not have bladder incontinence. For mental condition, 82% did not have insomnia, 87% did not have anxiety, and 96% did not have depression.
However, 77% of the sample group faced stress. Regarding the home environment, 90.0% had a bathroom in the home, and 91.0% reported a slippery bathroom floor. 48% of the sample group had the skills to look after themselves while sick and to maintain a healthy lifestyle. Besides, factors such as gender, age, and educational background were related to health perception (statistical significance < 0.05). Suggestion: nationally standardized instruments such as the ST-5, 2Q, and Barthel ADL index are available; reliability coefficients (alpha) were .784 for perceived self-efficacy in self-care and .827 for perceived social support. The instrument used to collect perceived social support should be further developed to study the level of influence of social support on the health of the elderly.

Keywords: health status, perception of aging, self-efficacy, social support

Procedia PDF Downloads 529
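The reliability coefficients the abstract reports (.784 and .827) are Cronbach's alpha values for the rating scales. A minimal sketch of that computation is below; the item scores are made-up 5-point Likert responses, not the study's data.

```python
# Cronbach's alpha: (k/(k-1)) * (1 - sum(item variances)/variance(total score)).
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 5-point responses: 5 respondents x 4 items.
scores = np.array([[4, 5, 4, 4],
                   [3, 3, 4, 3],
                   [5, 5, 5, 4],
                   [2, 3, 2, 3],
                   [4, 4, 5, 4]])

print(round(cronbach_alpha(scores), 3))
```

Values above roughly .7, like those in the abstract, are conventionally read as acceptable internal consistency for a survey scale.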
5061 Aging Evaluation of Ammonium Perchlorate/Hydroxyl Terminated Polybutadiene-Based Solid Rocket Engine by Reactive Molecular Dynamics Simulation and Thermal Analysis

Authors: R. F. B. Gonçalves, E. N. Iwama, J. A. F. F. Rocco, K. Iha

Abstract:

Propellants based on Hydroxyl Terminated Polybutadiene/Ammonium Perchlorate (HTPB/AP) are the most commonly used in the rocket engines of the Brazilian Armed Forces. This work examined the possibility of extending their useful life (currently 10 years) by performing chemical-kinetic analyses of the energetic material via Differential Scanning Calorimetry (DSC) and by computer simulation of the aging process using the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) software. DSC thermal analysis was performed in triplicate at three heating rates (5, 10, and 15 ºC/min) on propellant from a rocket motor with an 11-year shelf life; the activation energy was obtained from the Arrhenius equation using the Ozawa and Kissinger kinetic methods, allowing comparison with data from the time of manufacture (standard motor). In addition, the kinetic parameters of the combustion-chamber internal pressure of eight rocket engines with 11 years of shelf life were acquired for comparison with engine start-up data.

Keywords: shelf-life, thermal analysis, Ozawa method, Kissinger method, LAMMPS software, thrust

Procedia PDF Downloads 112
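The Kissinger analysis mentioned above extracts an activation energy from how the DSC peak temperature shifts with heating rate: plotting ln(β/Tp²) against 1/Tp gives a line of slope −Ea/R. The sketch below runs that fit on invented peak temperatures, not the propellant data from the paper.

```python
# Kissinger method: ln(beta / Tp^2) = ln(A*R/Ea) - Ea/(R*Tp).
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

beta = np.array([5.0, 10.0, 15.0])    # heating rates (K/min), as in the study
Tp = np.array([601.0, 612.0, 619.0])  # hypothetical DSC peak temperatures, K

y = np.log(beta / Tp**2)
x = 1.0 / Tp

# Degree-1 least-squares fit; the slope carries the activation energy.
slope, intercept = np.polyfit(x, y, 1)
Ea = -slope * R / 1000.0  # activation energy, kJ/mol

print(round(Ea, 1))
```

Comparing Ea fitted from an aged motor against the value at manufacture is the shelf-life argument the abstract outlines; the heating-rate units only shift the intercept, not the slope.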
5060 Convergence Analysis of Reactive Power Based Schemes Used in Sensorless Control of Induction Motors

Authors: N. Ben Si Ali, N. Benalia, N. Zerzouri

Abstract:

Many electronic drives for induction motor control are based on sensorless technologies. Conventional speed and torque control requires a speed or position sensor, which demands additional mounting space, reduces reliability, and increases cost. This paper analyses the dynamic performance and the sensitivity to motor-parameter changes of a reactive-power-based technique used in sensorless control of induction motors. The validity of the theoretical results is verified by simulation.

Keywords: adaptive observers, model reference adaptive system, RP-based estimator, sensorless control, stability analysis

Procedia PDF Downloads 537
5059 Fabrication of Antimicrobial Dental Model Using Digital Light Processing (DLP) Integrated with 3D-Bioprinting Technology

Authors: Rana Mohamed, Ahmed E. Gomaa, Gehan Safwat, Ayman Diab

Abstract:

Background: Bio-fabrication is a multidisciplinary research field that combines principles, fabrication techniques, and protocols from several disciplines. The open-source-software movement supports the use of open-source licenses for some or all software as part of the broader notion of open collaboration; it began in order to spread software and hardware that is cheaper, more reliable, and of better quality. Additive manufacturing (AM) is the concept behind 3D printing: objects are manufactured layer by layer from computer-aided designs (CAD). AM systems can be categorized by the type of process used. One AM technology is digital light processing (DLP), a 3D printing technology that rapidly cures a photopolymer resin to create hard scaffolds; DLP uses a projected light source to cure (harden or crosslink) an entire layer at once. Current applications of DLP are focused on dental and medical uses, and developments in this field have led to the revolutionary field of 3D bioprinting. Objective: To modify a desktop 3D printer into a 3D bioprinter and to integrate DLP technology with bio-fabrication to produce an antibacterial dental model. Method: A desktop 3D printer was modified into a 3D bioprinter. Gelatin and sodium alginate hydrogels were prepared at different concentrations. Rhizome of Zingiber officinale, flower buds of Syzygium aromaticum, and bulbs of Allium sativum were extracted, and extracts were prepared at different levels (powder, aqueous extracts, total oils, and essential oils) for antibacterial testing. The agar well diffusion method with E. coli was used as the sensitivity test for the antibacterial activity of the extracts of Zingiber officinale, Syzygium aromaticum, and Allium sativum.
Lastly, DLP printing was performed to produce several dental models combining the natural extracts with hydrogel to represent and simulate hard and soft tissues. Result: The desktop 3D printer was modified into a 3D bioprinter using the open-source Marlin firmware and custom 3D-printed parts. Sodium alginate and gelatin hydrogels were prepared at 5%, 10%, and 15% (w/v). The resin was integrated with the natural extracts of Zingiber officinale, Syzygium aromaticum, and Allium sativum at 1-3% for each extract. Finally, the antimicrobial dental model was printed, exhibited antimicrobial activity, and was merged with sodium alginate hydrogel. Conclusion: The open-source movement succeeded in producing a low-cost desktop 3D bioprinter, showing the potential for further enhancement in this scope. Additionally, integrating DLP technology with bioprinting is a promising step toward exploiting the antimicrobial activity of natural products.

Keywords: 3D printing, 3D bio-printing, DLP, hydrogel, antibacterial activity, zingiber officinale, syzygium aromaticum, allium sativum, panax ginseng, dental applications

Procedia PDF Downloads 81
5058 An Efficient Strategy for Relay Selection in Multi-Hop Communication

Authors: Jung-In Baik, Seung-Jun Yu, Young-Min Ko, Hyoung-Kyu Song

Abstract:

This paper proposes an efficient relaying algorithm that obtains diversity to improve the reliability of a signal. The algorithm achieves time or space diversity gain by delivering multiple versions of the same signal through two routes. Relays are placed between a source and destination, and the routes between them are set adaptively in order to deal with different channels and noise. Each route consists of one or more relays, and the source transmits its signal to the destination through the routes. The signals from the relays are combined and detected at the destination. The proposed algorithm provides better bit error rate (BER) performance than conventional algorithms.

Keywords: multi-hop, OFDM, relay, relaying selection

Procedia PDF Downloads 437
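The route-selection idea above can be sketched with a simple model: under a decode-and-forward assumption, a multi-hop route's end-to-end quality is limited by its weakest hop, the better of the two routes can be selected, and combining both copies at the destination adds their SNRs (as in maximal-ratio combining). This is an interpretive sketch with invented hop SNRs, not the paper's algorithm in detail.

```python
# Toy route-selection / diversity-combining sketch. Hop SNRs are linear-scale
# illustrative numbers, not data from the paper.

def route_snr(hop_snrs):
    """End-to-end SNR of a decode-and-forward route ~ its bottleneck hop."""
    return min(hop_snrs)

route_a = [12.0, 7.5, 9.0]   # source -> relay1 -> relay2 -> destination
route_b = [6.0, 14.0]        # source -> relay  -> destination

# Selection: keep the route with the stronger bottleneck.
best = max([route_a, route_b], key=route_snr)

# Combining both independent copies at the destination (MRC-style): SNRs add,
# which is where the diversity gain over a single route comes from.
combined_snr = route_snr(route_a) + route_snr(route_b)

print(route_snr(best), combined_snr)
```

Either policy beats always using one fixed route, which mirrors the BER improvement the abstract claims for adaptive route selection.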
5057 Evaluating the Service Quality and Customers’ Satisfaction for Lihpaoland in Taiwan

Authors: Wan-Yu Liu, Tiffany April Lin, Yu-Chieh Tang, Yi-Lin Wang, Chieh-Hui Li

Abstract:

As national income in Taiwan has risen, the lifestyle of the public has changed, and the tourism industry has gradually moved from a service industry toward an experience economy. Lihpaoland is one of the most popular theme parks in Taiwan, yet work on the park's service quality has been lacking since its re-opening in 2012. Therefore, this study investigates the quality of the software/hardware facilities and services of Lihpaoland, with three goals: 1) analyzing how different tourist samples lead to different assessments of LihpaoLand's service quality; 2) analyzing how tourists respond to LihpaoLand's service tangibility, reliability, responsiveness, assurance, and empathy; 3) based on the theoretical and empirical results, proposing how to improve the park's overall facilities and services, and providing suggestions for decision-making to LihpaoLand or related businesses. The survey was conducted among tourists to LihpaoLand using convenience sampling, and 400 questionnaires were collected. Analysis results show that tourists paid much attention to the maintenance of amusement facilities and the safety of the park and were satisfied with them, which are great advantages of the park. However, transportation around LihpaoLand was inadequate, and the prices of the Fullon Hotel (the hotel closest to LihpaoLand) were not accepted by tourists; more promotion events are recommended. Additionally, the shows are not diversified and should be improved with the highest priority. Tourists paid little attention to service personnel's clothing and the ticket price, yet were not satisfied with them; hence, this study recommends designing more distinctive costumes and conducting ticket promotions.
Accordingly, the suggestions made in this study for LihpaoLand are as follows: 1) Diversified amusement facilities should be provided to satisfy needs at different ages. 2) Cheap but tasty catering and more distinctive souvenirs should be offered. 3) Diversified promotion schemes should be strengthened to increase the number of tourists. 4) The quality and professionalism of the service staff should be enhanced to earn public praise and encourage revisits. 5) Ticket promotions in peak seasons, low seasons, and special events should be conducted. 6) Proper traffic flows should be planned and combined with technologies to reduce tourists' waiting time. 7) The features of the theme landscape in LihpaoLand should be strengthened to attract tourists with special preferences. 8) Ticket discounts or premier points-card promotions should be adopted to reward tourists with high loyalty.

Keywords: service quality, customers’ satisfaction, theme park, Taiwan

Procedia PDF Downloads 457
5056 Numerical Modelling and Soil-structure Interaction Analysis of Rigid Ballast-less and Flexible Ballast-based High-speed Rail Track-embankments Using Software

Authors: Tokirhusen Iqbalbhai Shaikh, M. V. Shah

Abstract:

With an increase in travel demand and a reduction in travel time, high-speed rail (HSR) has been introduced in India. Simplified 3-D finite element modelling is necessary to predict the stability and deformation characteristics of railway embankments and the soil-structure interaction behaviour under high-speed design requirements for Indian soil conditions. The objective of this study is to analyse rigid ballast-less and flexible ballast-based high-speed rail track embankments under the critical conditions to which they are subjected, viz. the static condition, moving-train condition, sudden brake application, and derailment, using PLAXIS 3D. The input parameters for the analysis are soil type, thickness of the relevant strata, unit weight, Young's modulus, Poisson's ratio, undrained cohesion, friction angle, dilatancy angle, modulus of subgrade reaction, design speed, and other relevant anticipated data. Eurocode 1, IRS-004(D), IS 1343, IRS specifications, the California high-speed rail technical specifications, and the NHSRCL feasibility report are followed in this study.

Keywords: soil structure interaction, high speed rail, numerical modelling, PLAXIS3D

Procedia PDF Downloads 98
5055 Organizational Culture and Its Internalization of Change in the Manufacturing and Service Sector Industries in India

Authors: Rashmi Uchil, A. H. Sequeira

Abstract:

The post-liberalization era in India has seen an unprecedented growth of mergers, both domestic and cross-border. Indian organizations have slowly begun appreciating this inorganic method of growth. However, all is not well, as evidenced by the declining value creation of organizations after mergers. Several studies have identified organizational culture as one of the key factors affecting the success of mergers, but very few studies have been attempted in this realm in India. The current study attempts to identify the factors in the organizational culture variable that may be unique to India. It also focuses on the difference in the impact of organizational culture on mergers of organizations in the manufacturing and service sectors in India. The study uses a mixed research approach. An exploratory approach is adopted to identify the variables that constitute organizational culture specifically in the Indian scenario. A few hypotheses were developed from the identified variables and tested to arrive at a grounded theory; the grounded theory approach used in the study attempts to integrate the variables related to organizational culture. A descriptive approach is then used to validate the developed grounded theory with a new empirical data set and thus test the relationship between the organizational culture variables and the success of mergers. Empirical data were captured from merged organizations situated in major cities of India. These organizations represent a significant proportion of the total number of organizations that have adopted mergers. The mix of industries included software, banking, manufacturing, pharmaceuticals, and financial services. A mixed sampling approach was adopted: the first phase used the probability method of stratified random sampling, and the study further used the non-probability method of judgmental sampling.
An adequate sample size was identified, representing the top, middle, and junior management levels of the organizations that had adopted mergers. The validity and reliability of the research instrument were ensured with appropriate tests. Statistical tools such as regression analysis, correlation analysis, and factor analysis were used for data analysis. The results revealed a strong relationship between organizational culture and the success of mergers. The results are unique in that they highlight a marked difference in how organizations in the manufacturing sector internalized the change of organizational culture after a merger, while organizations in the service sector internalized the changes at a slower rate. The study thus portrays industries in the manufacturing sector as more proactive, which can contribute to a change in the perception of these organizations.

Keywords: manufacturing industries, mergers, organizational culture, service industries

Procedia PDF Downloads 284
5054 Traditional Drawing, BIM and Erudite Design Process

Authors: Maryam Kalkatechi

Abstract:

Nowadays, parametric design, scientific analysis, and digital fabrication are dominant, and many architectural practices increasingly seek to incorporate advanced digital software and fabrication in their projects. The dissertation research behind this paper proposed an erudite design process that combines digital and practical aspects in a strong frame within the method. The digital aspects are the progressive advancements in algorithm design and simulation software; these have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to implement construction and architecture knowledge within the algorithm to produce successful design processes. It also involves ongoing improvements in applying 3D printing to construction, achieved through 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation; it accommodates the architect's decisions on the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of the '3D printed construction unit'. The paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes.

Keywords: erudite, data-sketch, algorithm design in architecture, design process

Procedia PDF Downloads 264
5053 Study and Experimental Analysis of a Photovoltaic Pumping System under Three Operating Modes

Authors: Rekioua D., Mohammedi A., Rekioua T., Mehleb Z.

Abstract:

Photovoltaic water pumping is considered one of the most promising areas of photovoltaic application; the economy and reliability of solar electric power make it an excellent choice for remote water pumping. Two conventional techniques are currently in use: the directly coupled technique and the battery-buffered photovoltaic pumping system. In this paper, we present the performance of a photovoltaic pumping system under three operating modes. The aim of this work is to determine the effect of the parameters influencing photovoltaic pumping system performance, such as pumping head, system configuration, and climatic conditions. The obtained results are presented and discussed.

Keywords: batteries charge mode, photovoltaic pumping system, pumping head, submersible pump

Procedia PDF Downloads 492
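The influence of pumping head noted above follows directly from the hydraulic power balance: P = ρ·g·Q·H, divided by the pump/motor efficiency to get the electrical power the PV array must supply. The back-of-the-envelope sketch below uses that relation with illustrative flow, head, and efficiency values (not the paper's measurements).

```python
# Electrical power a PV array must supply at a given pumping duty point.
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def pv_array_power(flow_m3_per_h, head_m, pump_eff=0.45):
    """Required electrical power (W); pump_eff is an assumed wire-to-water efficiency."""
    q = flow_m3_per_h / 3600.0          # convert flow to m^3/s
    hydraulic = RHO * G * q * head_m    # hydraulic power, W
    return hydraulic / pump_eff

# Doubling the head doubles the required array power at fixed flow.
for head in (10, 30, 60):
    print(head, round(pv_array_power(2.0, head), 1))
```

This linear scaling with head is why head appears alongside system configuration and climate as a dominant parameter in the study.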
5052 Accurate Position Electromagnetic Sensor Using Data Acquisition System

Authors: Z. Ezzouine, A. Nakheli

Abstract:

This paper presents a high-precision position electromagnetic sensor system (HPESS) applicable to moving-object detection. The authors have developed a high-performance position sensor prototype dedicated to a students' laboratory. The challenge was to obtain a highly accurate, real-time sensor able to measure position, length, or displacement. An electromagnetic solution based on a two-coil induction principle was adopted: the HPESS converts mechanical motion to an electrical signal without direct contact, and the output signal can then be fed to an electronic circuit. The voltage change from the sensor is captured by a data acquisition system using LabVIEW software, and the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specially for sensor-signal monitoring. The data are recorded and viewed through a user interface written in National Instruments LabVIEW; on-line displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, inexpensive transducer for highly sophisticated control systems.

Keywords: electromagnetic sensor, accuracy, data acquisition, position measurement

Procedia PDF Downloads 277
5051 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation

Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk

Abstract:

The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in the intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of utilization. In all these fields, the amount of the collected data is increasing quickly, but with the increase of the data, the computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing the redundancy in the rough sets can be achieved with the reduct. A lot of algorithms of generating the reduct were developed, but most of them are only software implementations, therefore have many limitations. Microprocessor uses the fixed word length, consumes a lot of time for either fetching as well as processing of the instruction and data; consequently, the software based implementations are relatively slow. Hardware systems don’t have these limitations and can process the data faster than a software. Reduct is the subset of the decision attributes that provides the discernibility of the objects. For the given decision table there can be more than one reduct. Core is the set of all indispensable condition attributes. None of its elements can be removed without affecting the classification power of all condition attributes. Moreover, every reduct consists of all the attributes from the core. In this paper, the hardware implementation of the two-stage greedy algorithm to find the one reduct is presented. The decision table is used as an input. Output of the algorithm is the superreduct which is the reduct with some additional removable attributes. First stage of the algorithm is calculating the core using the discernibility matrix. Second stage is generating the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. 
The algorithm described above has two disadvantages: i) it generates a superreduct instead of a reduct, and ii) the additional first stage may be unnecessary if the core is empty. For systems focused on fast computation of the reduct, however, the first disadvantage is not a key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called the 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in C and run on a PC, and the execution times of the reduct calculation in hardware and in software were compared. Results show an increase in the speed of data processing.
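For readers who want to experiment before committing to hardware, the two-stage scheme (core from the singleton entries of the discernibility matrix, then greedy enrichment by attribute frequency) can be sketched in Python as a software reference model; the decision tables used below are hypothetical illustrations, not the paper's benchmark data.

```python
# Sketch of the two-stage greedy superreduct algorithm (software reference
# model, not the FPGA design). The decision tables below are hypothetical.

def discernibility(objects, decision):
    """Discernibility matrix entries for object pairs with different decisions."""
    entries = []
    for i in range(len(objects)):
        for j in range(i + 1, len(objects)):
            if decision[i] != decision[j]:
                diff = {a for a, (x, y) in enumerate(zip(objects[i], objects[j]))
                        if x != y}
                if diff:  # skip inconsistent pairs with identical attribute values
                    entries.append(diff)
    return entries

def superreduct(objects, decision):
    matrix = discernibility(objects, decision)
    # Stage 1: the core is every attribute appearing as a singleton entry.
    core = {next(iter(e)) for e in matrix if len(e) == 1}
    reduct = set(core)
    remaining = [e for e in matrix if not (e & reduct)]
    # Stage 2: greedily add the most frequent attribute among undiscerned pairs.
    while remaining:
        counts = {}
        for entry in remaining:
            for a in entry:
                counts[a] = counts.get(a, 0) + 1
        best = max(counts, key=counts.get)
        reduct.add(best)
        remaining = [e for e in remaining if best not in e]
    return core, reduct
```

With a small toy table, `superreduct` returns the core plus whatever attributes are needed to discern all remaining object pairs.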

Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set

Procedia PDF Downloads 208
5050 Hansen Solubility Parameters, Quality by Design Tool for Developing Green Nanoemulsion to Eliminate Sulfamethoxazole from Contaminated Water

Authors: Afzal Hussain, Mohammad A. Altamimi, Syed Sarim Imam, Mudassar Shahid, Osamah Abdulrahman Alnemer

Abstract:

Extensive use of sulfamethoxazole (SUX) has become a global threat to human health due to water contamination from diverse sources. This work addresses the combined application of Hansen solubility parameters (HSPiP software) and the Quality by Design tool for developing various green nanoemulsions. The HSPiP program assisted in screening suitable excipients based on Hansen solubility parameters and experimental solubility data. Various green nanoemulsions were prepared and characterized for globular size, size distribution, zeta potential, and removal efficiency. Design Expert (DoE) software further helped to identify the critical factors with a direct impact on percent removal efficiency, size, and viscosity. Morphology was investigated under transmission electron microscopy (TEM). Finally, the treated water was analyzed to confirm the absence of the tested drug, employing the ICP-OES (inductively coupled plasma optical emission spectrometry) technique and HPLC (high-performance liquid chromatography). Results showed that HSPiP predicted a biocompatible lipid, a safe surfactant (lecithin), and propylene glycol (PG). The experimental solubility of the drug in the predicted excipients was quite convincing and vindicated the predictions. Various green nanoemulsions were fabricated and evaluated in vitro. Globular size (100-300 nm), PDI (0.1-0.5), zeta potential (~25 mV), and removal efficiency (%RE = 70-98%) were found to be in the acceptable range for deciding the input factors and their levels in DoE. The experimental design tool helped identify the most critical variables controlling %RE and the optimized content of the nanoemulsion under the set constraints. Dispersion time was varied from 5-30 min. Finally, ICP-OES and HPLC corroborated the absence of SUX in the treated water. Thus, the strategy is simple, economic, selective, and efficient.

Keywords: quality by design, sulfamethoxazole, green nanoemulsion, water treatment, icp-oes, hansen program (hspip software)

Procedia PDF Downloads 66
5049 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University

Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat

Abstract:

Scientific researchers face the analysis of very large data sets, which are growing at a noticeable rate in today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed for this purpose; the Hadoop framework is suitable for many different hardware platforms. In this research, a Scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open-source software with large computational capacity on a high-performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN. SLBD consists of a fast switch and Gigabit-Ethernet cards that connect four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster using the MapReduce algorithm. The MapReduce algorithm divides a task into smaller tasks that are assigned to the network nodes; the algorithm then collects the results and forms the final result dataset. The SLBD clustering system allows fast and efficient processing of the large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability, and cluster scalability.
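The split/shuffle/reduce pattern that MapReduce applies across the cluster can be illustrated with a minimal in-process word-count sketch; this models the concept only, not the Hadoop API itself.

```python
# Minimal in-process sketch of the MapReduce pattern: map each input chunk to
# key-value pairs, shuffle the pairs by key, then reduce each group. Real
# Hadoop distributes these phases across cluster nodes.
from collections import defaultdict

def map_phase(chunk):
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

def mapreduce(chunks):
    pairs = [p for chunk in chunks for p in map_phase(chunk)]
    return reduce_phase(shuffle(pairs))

word_counts = mapreduce(["big data big", "data cluster"])
```

In Hadoop, the `map_phase` calls would run on the nodes holding each data block and the shuffle would move pairs over the network before reduction.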

Keywords: big data platforms, cloudera manager, Hadoop, MapReduce

Procedia PDF Downloads 351
5048 Relevance of Copyright and Trademark in the Gaming Industry

Authors: Deeksha Karunakar

Abstract:

The gaming industry is one of the biggest industries in the world. Video games are interactive works of authorship that require the execution of a computer programme on specialized hardware but that also incorporate a wide variety of other artistic media, such as music, scripts, stories, video, paintings, and characters, in which the player takes an active role. Therefore, video games are not made as singular, simple works but rather as collections of elements that, if they reach a certain level of originality and creativity, can each be copyrighted on their own. A video game is made up of a wide variety of parts, all of which combine to form the overall sensation that we, the players, have while playing. The entirety of the components is implemented in the form of software code, which is then translated into the game's user interface. Even though copyright protection is already in place for the software code, the work produced by that code can also be protected by copyright. This includes the game's storyline or narrative, its characters, and even elements of the code on their own. Every sector requires an appropriate legal framework, and the gaming industry is no exception; this underlines the importance of intellectual property laws in each sector. This paper will explore the beginnings of video games, the various aspects of game copyrights, and the approach of the courts, including examples from a few different cases. Although the creative arts have always been known to draw inspiration from and build upon the works of others, it has not always been simple to evaluate whether a game has been cloned. The video game business is experiencing growth it has never seen before. The majority of today's video games are both pieces of software and works of audio-visual art.
Even though the existing legal framework does not have a clause specifically addressing video games, it is clear that there are a great many alternative means by which this protection can be granted. This paper demonstrates the importance of copyright and trademark laws in the gaming industry and its regulations with the help of relevant case law, utilizing a doctrinal methodology to support its findings. The aim of the paper is to raise awareness of the applicability of intellectual property laws in the gaming industry and of how the justice system is evolving to adapt to such new industries. Furthermore, it will provide in-depth knowledge of their relationship with each other.

Keywords: copyright, DMCA, gaming industry, trademark, WIPO

Procedia PDF Downloads 55
5047 Improvement in Oral Health-Related Quality of Life of Adult Patients After Rehabilitation With Partial Dentures: A Systematic Review and Meta-Analysis

Authors: Adama NS Bah

Abstract:

Background: Loss of teeth has a negative influence on essential oral functions such as phonetics, mastication, and aesthetics. Dentists treat patients with prosthodontic rehabilitation to recover these essential oral functions, and oral health-related quality of life measures reflect the success of that rehabilitation. In many countries, the current conventional care delivered to replace missing teeth in adult patients involves the provision of removable partial dentures. Aim: The aim of this systematic review and meta-analysis is to gather the best available evidence on the improvement in patients' oral health-related quality of life after treatment with partial dentures. Methods: We searched electronic databases from January 2010 to September 2019, including PubMed, ProQuest, Science Direct, Scopus, and Google Scholar. Studies were included only if the average participant age was 30 years or above and the study was published in English. Two reviewers independently screened and selected all references based on the inclusion criteria using the PRISMA guideline and assessed the quality of the included references using the Joanna Briggs Institute quality assessment tools. Extracted data were analyzed in RevMan 5.0 software. Heterogeneity between the studies was assessed using forest plots, I² statistics, and the chi-square test, with a P value less than 0.05 indicating statistical significance. Random-effects models were used in cases of moderate or high heterogeneity. Four studies were included in the systematic review and three studies were pooled for meta-analysis.
Results: Four studies were included in the systematic review and three in the meta-analysis, with a total of 285 patients, comparing the improvement in oral health-related quality of life before and after rehabilitation with partial dentures. The pooled results showed a better improvement of oral health-related quality of life after treatment with partial dentures (mean difference 5.25; 95% CI [3.81, 6.68], p < 0.00001), favoring the wearing of partial dentures. To ascertain the reliability of the studies included in the meta-analysis, risk of bias was assessed using the Cochrane collaboration tool and found to be low in all of them. Conclusion: There is strong evidence that rehabilitation with partial dentures can improve patients' oral health-related quality of life as measured with the Oral Health Impact Profile 14. This review has clinical evidence value for dentists treating the expanding vulnerable adult population.
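The inverse-variance pooling behind a result such as "mean difference 5.25; 95% CI [3.81, 6.68]" can be sketched as follows; the study effects and standard errors below are hypothetical illustrations, not the review's data.

```python
# Fixed-effect inverse-variance pooling of mean differences, the basic
# computation behind a pooled result with a 95% confidence interval.
# The effects and standard errors below are hypothetical, not the review's data.
import math

def pooled_mean_difference(effects, std_errors):
    """Pool per-study mean differences, weighting each by 1/SE^2."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

md, ci = pooled_mean_difference([5.0, 6.0], [1.0, 1.0])
```

A random-effects version, as used for moderate or high heterogeneity, would additionally inflate each study's variance by a between-study variance estimate before weighting.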

Keywords: meta-analysis, oral health impact profile, partial dentures, systematic review

Procedia PDF Downloads 95
5046 Formex Algebra Adaptation into Parametric Design Tools: Dome Structures

Authors: Réka Sárközi, Péter Iványi, Attila B. Széll

Abstract:

The aim of this paper is to present the adaptation of the formex algebra dome construction tool to the parametric design software Grasshopper. Formex algebra is a mathematical system, used primarily for planning structural systems such as truss-grid domes and vaults, together with the programming language Formian. The goal of the research is to allow architects to plan truss-grid structures easily with parametric design tools based on the versatile formex algebra mathematical system. To produce regular structures, coordinate system transformations are used, and the dome structures are defined in a spherical coordinate system. Owing to the abilities of the parametric design software, it is possible to apply further modifications to the structures and obtain special forms. The paper covers the basic dome types as well as additional dome-based structures using special coordinate-system solutions based on spherical coordinate systems. It also covers additional structural possibilities, such as making double-layer grids in all geometric forms. The adaptation of formex algebra and the parametric workflow of Grasshopper together enable quick and easy design and optimization of special truss-grid domes.
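The spherical-coordinate definition of dome nodes described above can be sketched as follows; the radius and grid divisions are assumed example values, not ones taken from the paper.

```python
# Sketch of generating truss-grid dome nodes on a regular spherical-coordinate
# grid and transforming them to Cartesian coordinates. The radius and grid
# divisions are assumed example values.
import math

def dome_nodes(radius, n_meridians, n_rings, max_polar=math.pi / 2):
    """Nodes of a dome sampled on a (phi, theta) grid of the sphere."""
    nodes = []
    for i in range(n_rings + 1):
        phi = max_polar * i / n_rings              # polar angle from the apex
        for j in range(n_meridians):
            theta = 2 * math.pi * j / n_meridians  # azimuth
            nodes.append((radius * math.sin(phi) * math.cos(theta),
                          radius * math.sin(phi) * math.sin(theta),
                          radius * math.cos(phi)))
    return nodes

nodes = dome_nodes(10.0, 8, 4)  # hemispherical dome: 8 meridians, 4 rings
```

In a Grasshopper workflow, the same transformation would typically be expressed with native components or a scripted node, with the grid parameters exposed as sliders.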

Keywords: parametric design, structural morphology, space structures, spherical coordinate system

Procedia PDF Downloads 241
5045 The Relationship between Knowledge Management Processes and Strategic Thinking at the Organization Level

Authors: Bahman Ghaderi, Hedayat Hosseini, Parviz Kafche

Abstract:

The role of knowledge management processes in achieving the strategic goals of organizations is crucial. To this end, the relationship between knowledge management processes and the different aspects of strategic thinking (followed by long-term organizational planning) should be understood. This research examines the relationship between each of the five knowledge management processes (creation, storage, transfer, audit, and deployment) and each dimension of strategic thinking (vision, creativity, thinking, communication, and analysis) in one of the major sectors of the food industry in Iran. In this research, knowledge management and its dimensions (knowledge acquisition, knowledge storage, knowledge transfer, knowledge auditing, and finally knowledge utilization) are considered independent variables, and strategic thinking and its dimensions (creativity, systematic thinking, vision, strategic analysis, and strategic communication) the dependent variable. The statistical population of this study consisted of 245 managers and employees of the Minoo Food Industrial Group in Tehran. A simple random sampling method was used, and data were collected via a questionnaire designed by the research team. Data were analyzed using SPSS 21 software; LISREL software was used for calculating and drawing the models and graphs. Among the factors investigated in the present study, knowledge storage, at 0.78, had the largest effect, and knowledge transfer, at 0.62, had the smallest effect on knowledge management and thus on strategic thinking.

Keywords: knowledge management, strategic thinking, knowledge management processes, food industry

Procedia PDF Downloads 160
5044 The Moderating Role of Test Anxiety in the Relationships Between Self-Efficacy, Engagement, and Academic Achievement in College Math Courses

Authors: Yuqing Zou, Chunrui Zou, Yichong Cao

Abstract:

Previous research has revealed relationships between self-efficacy (SE), engagement, and academic achievement among students in Western countries, but these relationships remain unexamined in college math courses among college students in China. In addition, previous research has shown that test anxiety has a direct effect on engagement and academic achievement; however, how test anxiety affects the relationships between SE, engagement, and academic achievement is still unknown. In this study, the authors aimed to explore the mediating roles of behavioral engagement (BE), emotional engagement (EE), and cognitive engagement (CE) in the association between SE and academic achievement, and the moderating role of test anxiety, in college math courses. Our hypotheses are that the association between SE and academic achievement is mediated by engagement and that test anxiety plays a moderating role in the association. To explore the research questions, the authors collected data through self-reported surveys among 147 students at a northwestern university in China. The motivated strategies for learning questionnaire (MSLQ) (Pintrich, 1991), the metacognitive strategies questionnaire (Wolters, 2004), and the engagement versus disaffection with learning scale (Skinner et al., 2008) were used to assess SE, CE, and BE and EE, respectively. R software was used to analyze the data. The main analyses were reliability and validity analysis of the scales, descriptive statistics of the measured variables, correlation analysis, regression analysis, and structural equation modeling (SEM) with moderated mediation analysis to examine the structural relationships between the variables simultaneously. The SEM analysis indicated that student SE was positively related to BE, EE, CE, and academic achievement. BE, EE, and CE were all positively associated with academic achievement.
That is, as the authors expected, higher levels of SE led to higher levels of BE, EE, and CE and to greater academic achievement, and higher levels of BE, EE, and CE led to greater academic achievement. In addition, the moderated mediation analysis found that the path from SE to academic achievement in the model was significant, as expected, as was the moderating effect of test anxiety on the SE-achievement association. Specifically, test anxiety was found to moderate the association between SE and BE, the association between SE and CE, and the association between EE and achievement. The authors investigated possible mediating effects of BE, EE, and CE in the association between SE and academic achievement, and all indirect effects were found to be significant. As for the magnitude of the mediations, behavioral engagement was the most important mediator in the SE-achievement association. This study has implications for college teachers, educators, and students in China regarding ways to promote academic achievement in college math courses, including increasing self-efficacy and engagement and lessening test anxiety toward math.
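The product-of-coefficients logic underlying a mediation analysis (the indirect effect as the product of the X→M path and the M→Y path controlling for X) can be sketched with plain least squares; the data below are synthetic illustrations, not the study's survey data, and intercepts are omitted for brevity.

```python
# Product-of-coefficients sketch of simple mediation (X -> M -> Y), the
# building block of the mediation part of an SEM analysis. Regressions are
# plain least squares without intercepts; the data are synthetic illustrations.

def ols2(y, x1, x2):
    """Least squares for y = b1*x1 + b2*x2 via the normal equations."""
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    sy1 = sum(a * b for a, b in zip(y, x1))
    sy2 = sum(a * b for a, b in zip(y, x2))
    det = s11 * s22 - s12 * s12
    return (sy1 * s22 - sy2 * s12) / det, (sy2 * s11 - sy1 * s12) / det

def indirect_effect(x, m, y):
    a = sum(xi * mi for xi, mi in zip(x, m)) / sum(xi * xi for xi in x)  # M ~ X
    b, _direct = ols2(y, m, x)                                           # Y ~ M + X
    return a * b  # indirect (mediated) effect of X on Y
```

A moderated mediation adds an interaction term (e.g., X times test anxiety) to these regressions so that the paths themselves vary with the moderator.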

Keywords: academic engagement, self-efficacy, test anxiety, academic achievement, college math courses, behavioral engagement, cognitive engagement, emotional engagement

Procedia PDF Downloads 83
5043 The Thoughts and Feelings of 60-72 Month Old Children about School and Teacher

Authors: Ayse Ozturk Samur, Gozde Inal Kiziltepe

Abstract:

No matter the level of education, starting school is an exciting process, as it includes new experiences. In this process, the child steps into an environment and institution different from the family into which he or she was born and feels secure. That new environment is different from home: it is a social environment that has its own rules, involves duties and responsibilities that must be fulfilled, and offers new vital experiences. Children who have a positive attitude towards school and like school are more enthusiastic and eager to participate in classroom activities. Moreover, a close relationship with the teacher enables the child to have positive emotions and ideas about the teacher and the school and helps children adapt to school easily. This study aims to identify children's perceptions of academic competence, attitudes towards school, and ideas about their teachers. In accordance with this aim, a mixed method that includes both qualitative and quantitative data collection is used; the study is supported with qualitative data collected after the quantitative data. The study group of the research consists of 250 randomly chosen children who are 60-72 months old and attending a preschool institution in a city center located in the West Anatolian region of Turkey. Quantitative data were collected using the Feelings about School (FAS) scale. The scale consists of 12 items and 4 dimensions: school, teacher, mathematics, and literacy. A reliability and validity study for the scale was conducted by the researchers with 318 children aged 60-72 months. For content validity, experts' opinions were sought; for construct validity, confirmatory factor analysis was utilized. Reliability of the scale was examined by calculating the internal consistency coefficient (Cronbach's alpha).
At the end of the analyses, it was found that the FAS is a valid and reliable instrument for identifying 60-72 month old children's perceptions of their academic competency, attitudes toward school, and ideas about their teachers. For the qualitative dimension of the study, semi-structured interviews were conducted with 30 children aged 60-72 months. At the end of the study, it was identified that children's perceptions of their academic competencies and attitudes towards school were medium-level and that their ideas about their teachers were high. Based on the semi-structured interviews with children, it was identified that they have a positive perception of school and teacher; that is, the quantitatively gathered data are supported by the qualitatively collected data.
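The internal consistency coefficient mentioned above (Cronbach's alpha) can be computed directly from item scores; the item data below are hypothetical, not the FAS validation sample.

```python
# Cronbach's alpha, the internal consistency coefficient used to assess a
# scale's reliability. Item scores below are hypothetical, not study data.

def variance(xs):
    """Sample variance (n-1 denominator)."""
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per scale item, respondents in the same order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))
```

Perfectly parallel items give an alpha of 1; values around 0.7 or higher are conventionally read as acceptable internal consistency.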

Keywords: feelings, preschool education, school, teacher, thoughts

Procedia PDF Downloads 212
5042 Application of Systems Engineering Tools and Methods to Improve Healthcare Delivery Inside the Emergency Department of a Mid-Size Hospital

Authors: Mohamed Elshal, Hazim El-Mounayri, Omar El-Mounayri

Abstract:

The emergency department (ED) is considered a complex system of interacting entities: patients, human resources, software and hardware systems, interfaces, and other systems. This paper presents research on implementing a detailed Systems Engineering (SE) approach in a mid-size hospital in central Indiana. The methodology will be applied by 'The Initiative for Product Lifecycle Innovation (IPLI)' institution at Indiana University to study and solve the crowding problem, with the aim of increasing patient throughput and enhancing the treatment experience; therefore, the nature of the crowding problem needs to be investigated together with all the other problems that lead to it. The SE methods presented are workflow analysis and systems modeling, where SE tools such as Microsoft Visio are used to construct a group of system-level diagrams that demonstrate: patients' workflow, documentation and communication flow, data systems, human resources workflow and requirements, the leadership involved, and integration between the different ED systems. Finally, the ultimate goal will be managing the process through implementation of an executable model using commercial software tools, which will identify bottlenecks, improve documentation flow, and help make the process faster.
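As a hypothetical illustration of the kind of executable model that can flag bottlenecks, steady-state M/M/1 queueing formulas give utilization and waiting metrics for a single ED service step; the arrival and service rates are assumed example values, and the study itself used commercial workflow-modeling tools rather than this formula.

```python
# Hypothetical illustration of an executable bottleneck model: steady-state
# M/M/1 queue metrics for a single ED service step (e.g. triage). The rates
# are assumed example values in patients per hour, not hospital data.

def mm1_metrics(arrival_rate, service_rate):
    rho = arrival_rate / service_rate                    # server utilization
    if rho >= 1:
        return {"utilization": rho, "stable": False}     # queue grows without bound
    wq = rho / (service_rate - arrival_rate)             # mean wait in queue (hours)
    return {"utilization": rho, "stable": True,
            "mean_queue_wait": wq,
            "mean_queue_length": arrival_rate * wq}      # Little's law: Lq = lambda * Wq

triage = mm1_metrics(4.0, 5.0)  # 4 arrivals/hour against 5 served/hour
```

A step whose utilization approaches 1 is a bottleneck candidate: its predicted queue wait grows sharply, which is exactly what an executable ED model is meant to surface.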

Keywords: systems modeling, ED operation, workflow modeling, systems analysis

Procedia PDF Downloads 169
5041 Automatic Aggregation and Embedding of Microservices for Optimized Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservices are a software development methodology in which applications are built by composing a set of independently deployable, small, modular services. Each service runs a unique process and gets instantiated and deployed on one or more machines (we assume that different microservices are deployed onto different machines). Microservices are becoming the de facto standard for developing distributed cloud applications due to their reduced release cycles. In principle, the responsibility of a microservice can be as simple as implementing a single function, which can lead to the following issues: - Resource fragmentation due to the virtual machine boundary. - Poor communication performance between microservices. Two composition techniques can be used to optimize resource fragmentation and communication performance: aggregation and embedding of microservices. Aggregation allows the deployment of a set of microservices on the same machine using a proxy server. Aggregation helps to reduce resource fragmentation and is particularly useful when the aggregated services have similar scalability behavior. Embedding deals with communication performance by deploying on the same virtual machine those microservices that require a communication channel (localhost bandwidth is reported to be about 40 times faster than cloud vendors' local networks, and it offers better reliability). Embedding can also reduce dependencies on load balancer services, since the communication takes place on a single virtual machine. For example, assume that microservice A has two instances, a1 and a2, and it communicates with microservice B, which also has two instances, b1 and b2. One embedding can deploy a1 and b1 on machine m1, while a2 and b2 are deployed on a different machine m2. This deployment configuration allows each pair (a1-b1), (a2-b2) to communicate using the localhost interface without the need for a load balancer between microservices A and B.
Aggregation and embedding techniques are complex, since different microservices might have incompatible runtime dependencies that forbid them from being installed on the same machine. There is also a security concern, since the attack surface between microservices can be larger. Luckily, container technology allows running several processes on the same machine in an isolated manner, solving both the incompatibility of runtime dependencies and the security concern above, and thus greatly simplifying aggregation/embedding implementations: a microservice container is simply deployed on the same machine as the aggregated/embedded microservice container. Therefore, a wide variety of deployment configurations can be described by combining aggregation and embedding to create an efficient and robust microservice architecture. This paper presents a formal method that receives a declarative definition of a microservice architecture and proposes different optimized deployment configurations by aggregating/embedding microservices. The first prototype is based on i2kit, a deployment tool also submitted to ICWS 2018. The proposed prototype optimizes the following parameters: network/system performance, resource usage, resource costs, and failure tolerance.
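The a1-b1 / a2-b2 pairing in the example above amounts to a simple assignment of communicating instance pairs to machines, which can be sketched as follows (the service and machine names are the illustrative ones from the example, not part of any real deployment tool):

```python
# Sketch of the embedding idea from the example: co-locate communicating
# instance pairs (a1 with b1, a2 with b2) so each pair talks over localhost
# and no load balancer is needed between services A and B. Names are
# illustrative assumptions.

def embed(instances_a, instances_b, machines):
    """Assign one instance of each communicating service to every machine."""
    if not (len(instances_a) == len(instances_b) == len(machines)):
        raise ValueError("one instance pair per machine is assumed")
    return {m: (a, b) for m, a, b in zip(machines, instances_a, instances_b)}

deployment = embed(["a1", "a2"], ["b1", "b2"], ["m1", "m2"])
```

A real optimizer, like the formal method the paper describes, would additionally weigh resource usage, costs, and failure tolerance before fixing such an assignment.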

Keywords: aggregation, deployment, embedding, resource allocation

Procedia PDF Downloads 190
5040 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question-and-answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official programming language documentation. While tools have sought to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through the many possible answers to their questions, and this extends development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers using various modeling techniques. However, less interest has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to models' and features' complexity. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015, and 2016. Our findings reveal significant differences in models' performance and quality given the type of features and the complexity of the models used. Researchers examining classifiers' performance and quality and features' complexity may leverage these findings in selecting suitable techniques when developing prediction models.

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 122
5039 Characterizing the Rectification Process for Designing Scoliosis Braces: Towards Digital Brace Design

Authors: Inigo Sanz-Pena, Shanika Arachchi, Dilani Dhammika, Sanjaya Mallikarachchi, Jeewantha S. Bandula, Alison H. McGregor, Nicolas Newell

Abstract:

The use of orthotic braces for adolescent idiopathic scoliosis (AIS) patients is the most common non-surgical treatment to prevent deformity progression. The traditional method to create an orthotic brace involves casting the patient’s torso to obtain a representative geometry, which is then rectified by an orthotist to the desired geometry of the brace. Recent improvements in 3D scanning technologies, rectification software, CNC, and additive manufacturing processes have made it possible to complement, or in some cases replace, manual methods with digital approaches. However, the rectification process remains dependent on the orthotist’s skills. Therefore, the rectification process needs to be carefully characterized to ensure that braces designed through a digital workflow are as efficient as those created using a manual process. The aim of this study is to compare 3D scans of patients with AIS against 3D scans of both pre- and post-rectified casts that have been manually shaped by an orthotist. Six AIS patients were recruited from the Ragama Rehabilitation Clinic, Colombo, Sri Lanka. All patients were between 10 and 15 years old, were skeletally immature (Risser grade 0-3), and had Cobb angles between 20-45°. Seven spherical markers were placed at key anatomical locations on each patient’s torso and on the pre- and post-rectified molds so that distances could be reliably measured. 3D scans were obtained of 1) the patient’s torso and pelvis, 2) the patient’s pre-rectification plaster mold, and 3) the patient’s post-rectification plaster mold using a Structure Sensor Mark II 3D scanner (Occipital Inc., USA). 3D stick body models were created for each scan to represent the distances between anatomical landmarks. The 3D stick models were used to analyze the changes in position and orientation of the anatomical landmarks between scans using Blender open-source software.
3D surface deviation maps represented volume differences between the scans, generated using CloudCompare open-source software. The 3D stick body models showed changes in the position and orientation of thorax anatomical landmarks between the patient and the post-rectification scans for all patients. Anatomical landmark position and volume differences were seen between 3D scans of the patients’ torsos and the pre-rectified molds. Between the pre- and post-rectified molds, material removal was consistently seen on the anterior side of the thorax and the lateral areas below the ribcage. Volume differences were seen in areas where the orthotist planned to place pressure pads (usually at the trochanter on the side to which the lumbar curve was tilted (trochanter pad), at the lumbar apical vertebra (lumbar pad), on the rib connected to the apical vertebrae at the mid-axillary line (thoracic pad), and on the ribs corresponding to the upper thoracic vertebra (axillary extension pad)). The rectification process requires the skill and experience of an orthotist; however, this study demonstrates that the brace shape, location, and volume of material removed from the pre-rectification mold can be characterized and quantified. Results from this study can be fed into software that can accelerate the brace design process and take steps towards an automated digital rectification process.
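Quantifying how far each anatomical landmark moves between scans reduces to Euclidean distances between matched marker coordinates, which can be sketched as follows; the landmark name and coordinates are hypothetical values in millimetres, not study measurements.

```python
# Sketch of quantifying rectification: Euclidean distance between matched
# anatomical landmarks on two scans. The landmark name and coordinates (mm)
# are hypothetical illustrations.
import math

def landmark_shifts(pre, post):
    """pre/post: dicts mapping landmark name -> (x, y, z) coordinates."""
    return {name: math.dist(pre[name], post[name]) for name in pre}

shifts = landmark_shifts({"marker_1": (0.0, 0.0, 0.0)},
                         {"marker_1": (3.0, 4.0, 0.0)})
```

In practice the two scans must first be registered into a common coordinate frame before such per-landmark distances are meaningful.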

Keywords: additive manufacturing, orthotics, scoliosis brace design, sculpting software, spinal deformity

Procedia PDF Downloads 134
5038 Detailed Analysis of Multi-Mode Optical Fiber Infrastructures for Data Centers

Authors: Matej Komanec, Jan Bohata, Stanislav Zvanovec, Tomas Nemecek, Jan Broucek, Josef Beran

Abstract:

With the exponential growth of social networks, video streaming, and increasing demands on data rates, the number of newly built data centers rises proportionately. The data centers, however, have to adjust to the rapidly increased amount of data that has to be processed. For this purpose, multi-mode (MM) fiber based infrastructures are often employed. It stems from the fact that connections in data centers are typically realized over short distances, and the application of MM fibers and components considerably reduces costs. On the other hand, the usage of MM components brings specific requirements for installation and service conditions. Moreover, it has to be taken into account that MM fiber components have a higher production tolerance for parameters like core and cladding diameters, eccentricity, etc. Due to the high demands on the reliability of data center components, the determination of the properly excited optical field inside the MM fiber core is one of the key parameters when designing such an MM optical system architecture. An appropriately excited mode field of the MM fiber provides an optimal power budget in connections, leads to a decrease in insertion loss (IL), and achieves effective modal bandwidth (EMB). The main parameter, in this case, is the encircled flux (EF), which should be properly defined for variable optical sources and the consequent different mode-field distributions. In this paper, we present a detailed investigation and measurements of the mode-field distribution for short MM links intended in particular for data centers, with an emphasis on reliability and safety. These measurements are essential for large MM network design. Various scenarios, containing different fibers and connectors, were tested in terms of IL and mode-field distribution to reveal potential challenges.
Furthermore, particular defects and errors that can realistically occur, such as eccentricity, connector shifting, or dust, were simulated and measured, and their dependence on EF statistics and on the functionality of the data center infrastructure was evaluated. The experimental tests were performed at two wavelengths commonly used in MM networks, 850 nm and 1310 nm, to verify EF statistics. Finally, we provide recommendations for data center systems and networks using OM3 and OM4 MM fiber connections.
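The encircled flux itself is the fraction of total optical power contained within a given radius of the core centre, which can be sketched numerically from a sampled radial intensity profile; the uniform profile used below is an illustrative check (for which EF should equal (r/R)²), not measured data.

```python
# Numerical sketch of encircled flux (EF): the fraction of total power within
# radius r, computed from a sampled radial intensity profile I(r) by
# trapezoidal integration of I(r')*2*pi*r' dr'. The uniform profile below is
# an illustrative check, not measured data.
import math

def encircled_flux(radii, intensities, r):
    """Power fraction within r for a radially sampled intensity profile."""
    total = inside = 0.0
    for i in range(1, len(radii)):
        dr = radii[i] - radii[i - 1]
        ring = math.pi * (intensities[i] * radii[i]
                          + intensities[i - 1] * radii[i - 1]) * dr
        total += ring
        if radii[i] <= r:
            inside += ring
    return inside / total

radii = [float(k) for k in range(11)]        # 0..10 (e.g. micrometres)
ef = encircled_flux(radii, [1.0] * 11, 5.0)  # uniform disk: EF(5) = (5/10)^2
```

Launch-condition standards constrain EF at specified radii so that IL and EMB measurements are reproducible across different sources.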

Keywords: optical fiber, multi-mode, data centers, encircled flux

Procedia PDF Downloads 366