Search results for: open source software.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11437

10627 Genetic Association of SIX6 Gene with Pathogenesis of Glaucoma

Authors: Riffat Iqbal, Sidra Ihsan, Andleeb Batool, Maryam Mukhtar

Abstract:

Glaucoma is a group of optic neuropathies characterized by progressive degeneration of retinal ganglion cells. It is a clinically and genetically heterogeneous disease comprising several distinct forms, each with different causes and severities. Primary open-angle glaucoma (POAG) is the most common type of glaucoma. This study investigated the genetic association of single nucleotide polymorphisms (SNPs; rs10483727 and rs33912345) at the SIX1/SIX6 locus with primary open-angle glaucoma (POAG) in the Pakistani population. The SIX6 gene plays an important role in ocular development and has been associated with the morphology of the optic nerve. A total of 100 patients clinically diagnosed with glaucoma and 100 control individuals aged over 40 were enrolled in the study. Genomic DNA was extracted by the organic extraction method. SNP genotyping was done by PCR-based restriction fragment length polymorphism (RFLP) and sequencing. Significant genetic associations were observed for rs10483727 (risk allele T) and rs33912345 (risk allele C) with POAG. Hence, it was concluded that the SIX6 gene is genetically associated with the pathogenesis of glaucoma in the Pakistani population.

Keywords: genotyping, Pakistani population, primary open-angle glaucoma, SIX6 gene

Procedia PDF Downloads 181
10626 Research on Architectural Steel Structure Design Based on BIM

Authors: Tianyu Gao

Abstract:

Digital architectures use computer-aided design, programming, simulation, and imaging to create virtual forms and physical structures. Today's customers want to know more about their buildings: they want an automatic thermostat that learns their behavior and contacts them, and doors and windows they can open with a mobile app. The architectural display form is therefore more closely related to the customer's experience. With the aim of building informationization, this paper studies steel structure design based on BIM. Taking the Zigan office building in Hangzhou as an example, the study is divided into four parts: the digital design model of the steel structure, node analysis of the steel structure, digital production, and construction of the steel structure. Through the application of BIM software, the architectural design can be coordinated and the building components can be informationized. Not only can design feedback be obtained at an early stage, but the stability of the construction can also be guaranteed. In this way, monitoring of the entire life cycle of the building and meeting customer needs can be realized.

Keywords: digital architectures, BIM, steel structure, architectural design

Procedia PDF Downloads 191
10625 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, debugging is essential to reveal all software issues; unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting it or determining a way to work around it. It provides a layer that is easy to integrate into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.

Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework

Procedia PDF Downloads 281
10624 [Keynote Talk]: Knowledge Codification and Innovation Success within Digital Platforms

Authors: Wissal Ben Arfi, Lubica Hikkerova, Jean-Michel Sahut

Abstract:

This study examines interfirm networks in the digital transformation era and, in particular, how tacit knowledge codification affects innovation success within digital platforms. One of the most important features of digital transformation and innovation process outcomes is the emergence of digital platforms, as an interfirm network, at the heart of open innovation. This research aims to illuminate how digital platforms influence inter-organizational innovation through virtual team interactions and knowledge sharing practices within an interfirm network. It thereby contributes to the strategic management literature on new product development (NPD), open innovation, industrial management, and the management of emerging interfirm networks. The empirical findings show, on the one hand, that knowledge conversion may be enhanced, especially by socialization, which appears to be the most important phase, as it plays a crucial role in holding the virtual team members together. On the other hand, in the process of socialization, tacit knowledge codification is crucial because it provides the structure the interfirm network actors need to interact and act toward common goals, which favors the emergence of open innovation. Finally, our results offer several conditions, necessary but not always sufficient, for interfirm managers involved in NPD and innovation strategies to shape increasingly interconnected and borderless markets and business collaborations. In the digital transformation era, the need for adaptive and innovative business models, as well as new and flexible network forms, is more significant than ever. Supported by technological advancements and digital platforms, companies can benefit from increased market opportunities and create new markets for their innovations through alliances and collaborative strategies, as a way of reducing or eliminating environmental uncertainty and entry barriers. Consequently, an efficient and well-structured interfirm network is essential to create network capabilities, ensure tacit knowledge sharing, enhance organizational learning, and foster open innovation success within digital platforms.

Keywords: interfirm networks, digital platform, virtual teams, open innovation, knowledge sharing

Procedia PDF Downloads 123
10623 A PHREEQC Reactive Transport Simulation for Simply Determining Scaling during Desalination

Authors: Andrew Freiburger, Sergi Molins

Abstract:

Freshwater is a vital resource, yet the supply of clean freshwater is diminishing as a consequence of melting snow and ice from global warming, pollution from industry, and increasing demand from human population growth. This unsustainable trajectory is projected to jeopardize water security for billions of people in the 21st century. Membrane desalination technologies may resolve the growing discrepancy between supply and demand by filtering arbitrary feed water into a fraction of renewable, clean water and a fraction of highly concentrated brine. The leading hindrance of membrane desalination is fouling, whereby the highly concentrated brine encourages microbial colonization and/or the precipitation of occlusive minerals (i.e., scale) on the membrane surface. An understanding of brine formation is therefore necessary to mitigate membrane fouling and to develop efficacious desalination technologies that can bolster the supply of available freshwater. This study presents a reactive transport simulation of brine formation and scale deposition during reverse osmosis (RO) desalination. The simulation conceptually represents the RO module as a one-dimensional domain, where feed water enters the domain with a prescribed fluid velocity and is iteratively concentrated in the immobile layer of a dual-porosity model. The PHREEQC geochemical code numerically evaluated the conceptual model with parameters for the BW30-400 RO module and for real feed water sources, e.g., the Red and Mediterranean seas and produced waters from American oil wells, based on peer-reviewed data. The presented simulation is computationally simpler, and hence less resource intensive, than existing, more rigorous simulations of desalination phenomena such as TOUGHREACT. The end user may readily prepare input files and execute simulations on a personal computer with open source software. The graphical results of fouling potential and brine characteristics may therefore be particularly useful as an initial tool for screening candidate feed water sources and/or informing the selection of an RO module.
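The iterative concentration step described above can be illustrated with a minimal, stand-alone sketch of brine concentration along a one-dimensional RO module. It is not PHREEQC: activities, speciation, and the dual-porosity exchange are omitted, and all numbers (feed composition, gypsum solubility product, recovery) are illustrative assumptions.

```python
KSP_GYPSUM = 10 ** -4.58  # illustrative solubility product for CaSO4*2H2O

def brine_profile(ca0, so40, recovery, cells=10):
    """Concentration factor and crude saturation ratio in each cell of the module."""
    profile = []
    for i in range(1, cells + 1):
        r = recovery * i / cells      # cumulative water recovery up to cell i
        cf = 1.0 / (1.0 - r)          # concentration factor of the brine
        ca, so4 = ca0 * cf, so40 * cf
        sr = (ca * so4) / KSP_GYPSUM  # ion product / Ksp (activities taken as molalities)
        profile.append((cf, sr))
    return profile

def scaling_onset(profile):
    """Index of the first cell where the brine is supersaturated (SR > 1)."""
    for i, (_, sr) in enumerate(profile):
        if sr > 1.0:
            return i
    return None
```

For a hypothetical feed with 3 mmol/L each of Ca and SO4 at 50% recovery, this sketch flags supersaturation only near the brine end of the module; a real PHREEQC run replaces the crude ion product with full speciation and activity corrections.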

Keywords: desalination, PHREEQC, reactive transport, scaling

Procedia PDF Downloads 130
10622 Evaluation of Sensor Pattern Noise Estimators for Source Camera Identification

Authors: Benjamin Anderson-Sackaney, Amr Abdel-Dayem

Abstract:

This paper presents a comprehensive survey of recent source camera identification (SCI) systems. The performance of various sensor pattern noise (SPN) estimators was then experimentally assessed under common photo response non-uniformity (PRNU) frameworks. The experiments used 1350 natural and 900 flat-field images captured by 18 individual cameras. Twelve experiments, grouped into three sets, were conducted, and the results were analyzed using receiver operating characteristic (ROC) curves. The experimental results demonstrated that combining the basic SPN estimator with a wavelet-based filtering scheme provides promising results, whereas the phase SPN estimator fits better with both patch-based (BM3D) and anisotropic diffusion (AD) filtering schemes.
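As a toy illustration of the basic SPN estimator pipeline evaluated here (denoise, take the residual, average over many images, then correlate), the following sketch works on 1-D signals, with a moving-average filter standing in for the wavelet-based denoiser. It is a conceptual sketch, not the implementation assessed in the paper.

```python
import random

def denoise(img):
    # crude moving-average filter standing in for the wavelet denoiser
    return [sum(img[max(0, i - 1):i + 2]) / len(img[max(0, i - 1):i + 2])
            for i in range(len(img))]

def residual(img):
    # noise residual: image minus its denoised version
    return [p - d for p, d in zip(img, denoise(img))]

def estimate_spn(images):
    # basic SPN estimator: average the noise residuals of many images
    acc = [0.0] * len(images[0])
    for img in images:
        for i, r in enumerate(residual(img)):
            acc[i] += r
    return [a / len(images) for a in acc]

def ncc(a, b):
    # normalized cross-correlation, the usual SCI decision statistic
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0
```

A camera is identified when a query image's residual correlates with that camera's SPN fingerprint well above its correlation with the fingerprints of other cameras.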

Keywords: sensor pattern noise, source camera identification, photo response non-uniformity, anisotropic diffusion, peak to correlation energy ratio

Procedia PDF Downloads 435
10621 Predicting National Football League (NFL) Match with Score-Based System

Authors: Marcho Setiawan Handok, Samuel S. Lemma, Abdoulaye Fofana, Naseef Mansoor

Abstract:

This paper proposes a method to predict the outcome of National Football League matches using data from 2019 to 2022 and compares it with other popular models. The model uses open-source statistical data for each team, such as passing yards, rushing yards, fumbles lost, and scoring; each statistic has an offensive and a defensive component. For instance, a data set of anticipated values for a specific matchup is created by comparing the offensive passing yards gained by one team to the defensive passing yards given up by the opposition. We evaluated the model's performance by contrasting its results with those of established prediction algorithms. This research uses a neural network to predict the score of a National Football League match and then predicts the winner of the game.
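The offense-versus-defense pairing described above can be sketched as follows. The averaging rule, the stat names, and the numbers are illustrative assumptions, not the paper's exact feature construction.

```python
def matchup_features(team_off, opp_def):
    """Anticipated per-game values for one matchup: each offensive stat of a
    team is paired with the corresponding stat its opponent's defense allows."""
    return {stat: (team_off[stat] + opp_def[stat]) / 2.0 for stat in team_off}

# Hypothetical season averages: yards the offense gains vs. yards the defense allows
home_offense = {"passing_yards": 241.5, "rushing_yards": 147.6}
away_defense = {"passing_yards": 221.4, "rushing_yards": 107.1}
```

`matchup_features(home_offense, away_defense)` yields the anticipated values that would be fed to the neural network, alongside the mirrored pairing for the other side of the ball.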

Keywords: game prediction, NFL, football, artificial neural network

Procedia PDF Downloads 79
10620 Motor Controller Implementation Using Model Based Design

Authors: Cau Tran, Tu Nguyen, Tien Pham

Abstract:

Model-based design (MBD) is a mathematical and visual technique for addressing design issues in the fields of communications, signal processing, and complex control systems. It is utilized in several automotive, aerospace, industrial, and motion control applications, and virtual models are at the center of its software development process. In this study, the LAT (limited angle torque) motor is modeled in a simulation environment, and the LAT motor control is designed with a cascade structure consisting of a speed and a current control loop; the controller used in each loop has a PID structure. Based on classical design principles and motor parameters that match the design goals, the PID controller is created for the model. The MBD approach is then used to build the embedded software for motor control. The paper is divided into three sections: the first introduces the design process and the benefits and drawbacks of the MBD technique; the second covers the design of control software for LAT motors; the last presents the experimental results.
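Both loops of the cascade use a PID law. A discrete form, of the kind that MBD tooling would generate for the embedded target, can be sketched as below; the gains and the first-order motor model are illustrative assumptions, not the LAT motor's identified parameters.

```python
class PID:
    """Discrete PID controller with a fixed sample time dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt            # accumulate integral term
        deriv = (err - self.prev_err) / self.dt   # backward-difference derivative
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def simulate_speed_loop(setpoint, steps=20000, dt=0.001):
    """Close the loop around a crude first-order motor model (time constant tau)."""
    pid = PID(kp=2.0, ki=1.0, kd=0.0, dt=dt)
    speed, tau = 0.0, 0.5
    for _ in range(steps):
        u = pid.step(setpoint, speed)
        speed += (u - speed) / tau * dt  # first-order plant response
    return speed
```

With integral action, the simulated speed settles at the setpoint; in the real design, the inner current loop would run at a faster rate than this outer speed loop.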

Keywords: model based design, limited angle torque, intellectual property core, hardware description language, controller area network, user datagram protocol

Procedia PDF Downloads 92
10619 Sardine Oil as a Source of Lipid in the Diet of Giant Freshwater Prawn (Macrobrachium rosenbergii)

Authors: A. T. Ramachandra Naik, H. Shivananda Murthy, H. N. Anjanayappa

Abstract:

The freshwater prawn, Macrobrachium rosenbergii, is a popular crustacean widely cultured in monoculture systems in India and has high nutritional value in the human diet; hence, understanding its enzymatic and body composition is important in order to judge its flesh quality. Fish oil, especially that derived from the Indian oil sardine, is a good source of highly unsaturated fatty acids and a lipid source in fish/prawn diets. A 35% crude protein diet with graded levels of sardine oil as the fat source was formulated at four inclusion levels, viz. 2.07, 4.07, 6.07, and 8.07%, giving total feed lipid levels of 8.11, 10.24, 12.28, and 14.33%, respectively. A diet without sardine oil (6.05% total lipid) served as the basal treatment. The giant freshwater prawn, Macrobrachium rosenbergii, was used as the test animal, and the experiment lasted 112 days. Significantly higher weight gain was recorded in the treatment with 6.07% sardine oil, along with higher specific growth rate, food conversion ratio, and protein efficiency ratio. The 8.07% sardine oil diet produced the highest RNA:DNA ratio in the prawn muscle. Digestive enzyme analyses of the digestive tract and mid-gut gland showed the greatest activity in prawns fed the 8.07% diet.

Keywords: digestive enzyme, fish diet, Macrobrachium rosenbergii, sardine oil

Procedia PDF Downloads 322
10618 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos

Abstract:

Due to the high reliability attained by DNA tests since the 1980s, this kind of test has allowed the identification of a growing number of criminal cases, including old cases that were unsolved and now have a chance to be solved with this technology. Currently, the use of genetic profiling databases is a typical method to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles for a growing number of samples, which requires time and great storage capacity. It is therefore essential to develop methodologies, using software tools, capable of organizing the workflow and minimizing the time spent on both biological sample processing and the analysis of genetic profiles. Thus, the present work aims to develop a software system for forensic genetics laboratories that allows sample, criminal case, and local database management, minimizing the time spent in the workflow and helping to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows, and the requirements that the system incorporates have been considered. The system uses web technologies, HTML, CSS, and JavaScript, with the Node.js platform as the server, which offers great efficiency in data input and output. In addition, the data are stored in a relational database (MySQL), which is free, favoring better acceptance by users. The software system developed here brings more agility to the workflow and analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to increased resolution of crimes. The next step of this research is its validation, in order to operate in accordance with current Brazilian national legislation.
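A minimal relational sketch of the case/sample/profile management described above is shown below. SQLite stands in for the MySQL database used by the system, and all table and column names are illustrative assumptions, not the system's actual schema.

```python
import sqlite3

SCHEMA = """
CREATE TABLE criminal_case (
    id          INTEGER PRIMARY KEY,
    case_number TEXT UNIQUE NOT NULL
);
CREATE TABLE sample (
    id          INTEGER PRIMARY KEY,
    case_id     INTEGER NOT NULL REFERENCES criminal_case(id),
    sample_code TEXT UNIQUE NOT NULL,
    status      TEXT NOT NULL DEFAULT 'received'  -- received / processed / profiled
);
CREATE TABLE genetic_profile (
    sample_id INTEGER NOT NULL REFERENCES sample(id),
    locus     TEXT NOT NULL,                      -- e.g. an STR marker name
    allele1   TEXT NOT NULL,
    allele2   TEXT NOT NULL
);
"""

def open_db():
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    return conn

def register_sample(conn, case_number, sample_code):
    # create the case on first use, then attach the sample to it
    conn.execute("INSERT OR IGNORE INTO criminal_case (case_number) VALUES (?)",
                 (case_number,))
    (case_id,) = conn.execute("SELECT id FROM criminal_case WHERE case_number = ?",
                              (case_number,)).fetchone()
    conn.execute("INSERT INTO sample (case_id, sample_code) VALUES (?, ?)",
                 (case_id, sample_code))
    return case_id
```

Keeping profiles normalized per locus is what makes later profile-to-profile comparison a straightforward join rather than a bespoke matching routine.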

Keywords: database, forensic genetics, genetic analysis, sample management, software solution

Procedia PDF Downloads 367
10617 Reconstruction Spectral Reflectance Cube Based on Artificial Neural Network for Multispectral Imaging System

Authors: Iwan Cony Setiadi, Aulia M. T. Nasution

Abstract:

The multispectral imaging (MSI) technique has been used for skin analysis, especially for distant mapping of in-vivo skin chromophores by analyzing spectral data at each reflected image pixel. For ergonomic purposes, our multispectral imaging system is decomposed into two parts: a light source compartment based on LEDs with 11 different wavelengths, and a monochromatic 8-bit CCD camera with a C-mount objective lens. Control software with a MATLAB GUI was also developed. Our system provides 11 monoband images and is coupled with software that reconstructs hyperspectral cubes from these multispectral images. In this paper, we propose a new method to build a hyperspectral reflectance cube based on an artificial neural network algorithm. After preliminary corrections, a neural network is trained using the 32 natural colors from the X-Rite ColorChecker Passport, with reference spectra acquired by a spectrophotometer. This neural network is then used to retrieve a megapixel multispectral cube between 380 and 880 nm with a 5 nm resolution from a low-spectral-resolution multispectral acquisition. As hyperspectral cubes contain spectra for each pixel, comparison should be made between the theoretical values from the spectrophotometer and the reconstructed spectrum. To evaluate the reconstruction performance, we used the goodness-of-fit coefficient (GFC) and the root mean squared error (RMSE). To validate the reconstruction, a set of 8 color patches reconstructed by our MSI system was compared with those recorded by the spectrophotometer. The average GFC was 0.9990 (standard deviation = 0.0010) and the average RMSE was 0.2167 (standard deviation = 0.064).
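The two reconstruction metrics reported above have standard definitions, sketched here for a single pixel's spectrum (two lists of reflectance values sampled at matching wavelengths):

```python
def gfc(measured, reconstructed):
    """Goodness-of-fit coefficient: 1.0 means identical spectral shape."""
    num = abs(sum(m * r for m, r in zip(measured, reconstructed)))
    den = (sum(m * m for m in measured) ** 0.5 *
           sum(r * r for r in reconstructed) ** 0.5)
    return num / den

def rmse(measured, reconstructed):
    """Root mean squared error between the two spectra."""
    n = len(measured)
    return (sum((m - r) ** 2 for m, r in zip(measured, reconstructed)) / n) ** 0.5
```

A GFC of 0.9990, as reported above, indicates a near-perfect match of spectral shape; RMSE additionally penalizes absolute amplitude errors, which GFC ignores.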

Keywords: multispectral imaging, reflectance cube, spectral reconstruction, artificial neural network

Procedia PDF Downloads 318
10616 Requirement Engineering for Intrusion Detection Systems in Wireless Sensor Networks

Authors: Afnan Al-Romi, Iman Al-Momani

Abstract:

Applying Software Engineering (SE) processes is of vital importance and a key feature in critical, complex, large-scale systems, for example, safety systems, security service systems, and network systems. Inevitably, such systems carry risks, such as system vulnerabilities and security threats, and the probability of those risks increases in unsecured environments, such as wireless networks in general and Wireless Sensor Networks (WSNs) in particular. A WSN is a self-organizing network of sensor nodes connected by wireless links. WSNs consist of hundreds to thousands of low-power, low-cost, multi-function sensor nodes that are small in size and communicate over short ranges. The distribution of sensor nodes in an open, possibly unattended environment, in addition to resource constraints in terms of processing, storage, and power, places such networks under stringent limitations on lifetime (i.e., period of operation) and security. The importance of WSN applications in many military and civilian domains has drawn the attention of many researchers to WSN security. To address this important issue and overcome one of the main challenges of WSNs, researchers have developed security solutions in the form of software-based network Intrusion Detection Systems (IDSs). However, it has been witnessed that those IDSs are neither secure enough nor accurate enough to detect all malicious behaviors. The problem is thus the lack of coverage of all malicious behaviors in proposed IDSs, leading to unpleasant results such as delays in the detection process, low detection accuracy, or, even worse, detection failure, as illustrated in previous studies. Another problem is the energy consumption that IDSs cause in WSNs. In other words, not all requirements are implemented and then traced; moreover, not all requirements are identified or satisfied, as some requirements have been compromised. The drawbacks in current IDSs stem from researchers and developers not following structured software development processes when developing IDSs, resulting in inadequate requirement management, process, validation, and verification of requirements quality. Unfortunately, the WSN and SE research communities have been mostly impermeable to each other. Integrating SE and WSNs is a subject that will expand as technology evolves and spreads into industrial applications. Therefore, this paper studies the importance of Requirement Engineering in developing IDSs. It also studies a set of existing IDSs and illustrates the absence of Requirement Engineering and its effect. Conclusions are then drawn regarding applying requirement engineering to systems so that they deliver the required functionalities, with respect to operational constraints, at an acceptable level of performance, accuracy, and reliability.

Keywords: software engineering, requirement engineering, Intrusion Detection System, IDS, Wireless Sensor Networks, WSN

Procedia PDF Downloads 320
10615 Triose Phosphate Utilisation at the (Sub)Foliar Scale Is Modulated by Whole-plant Source-sink Ratios and Nitrogen Budgets in Rice

Authors: Zhenxiang Zhou

Abstract:

The triose phosphate utilisation (TPU) limitation to leaf photosynthesis is a biochemical process concerning the sub-foliar carbon sink-source (im)balance, in which photorespiration-associated amino acid export provides an additional outlet for carbon and increases the leaf photosynthetic rate. However, whether this process is regulated by whole-plant sink-source relations and nitrogen budgets remains unclear. We address this question by model analyses of gas-exchange data measured on leaves at three growth stages of rice plants grown at two nitrogen levels, where three means (leaf-colour modification, adaxial vs. abaxial measurements, and panicle pruning) were explored to alter source-sink ratios. Higher specific leaf nitrogen (SLN) resulted in higher rates of TPU and also led to the TPU limitation occurring at a lower intercellular CO2 concentration. Photorespiratory nitrogen assimilation was greater in higher-nitrogen leaves but became smaller in cases associated with yellower-leaf modification, abaxial measurement, or panicle pruning. The feedback inhibition of panicle pruning on rates of TPU was not always observed, because panicle pruning blocked nitrogen remobilisation from leaves to grains, and the increased SLN masked the feedback inhibition. The (sub)foliar TPU limitation can thus be modulated by whole-plant source-sink ratios and nitrogen budgets during rice grain filling, suggesting a close link between sub-foliar and whole-plant sink limitations.

Keywords: triose phosphate utilization, sink limitation, panicle pruning, Oryza sativa

Procedia PDF Downloads 88
10614 Design and Optimization of Soil Nailing Construction

Authors: Fereshteh Akbari, Farrokh Jalali Mosalam, Ali Hedayatifar, Amirreza Aminjavaheri

Abstract:

Soil nailing is an effective method to stabilize slopes and retaining structures, and the lateral and vertical displacements of retaining walls are important criteria for evaluating the safety risks to adjacent structures. This paper is devoted to optimization problems for retaining walls based on ABAQUS software. The effects of various parameters, such as nail length, orientation, arrangement, horizontal spacing, and bond skin friction, on the lateral and vertical displacement of retaining walls are investigated. To ensure accuracy, the mobilized shear stress acting around the perimeter of the nail-soil interface is also modeled in ABAQUS. The observed trend of the results is compared to previous research.

Keywords: retaining walls, soil nailing, ABAQUS software, lateral displacement, vertical displacement

Procedia PDF Downloads 124
10613 Filling the Gaps with Representation: Netflix’s Anne with an E as a Way to Reveal What the Text Hid

Authors: Arkadiusz Adam Gardaś

Abstract:

In his theory of gaps, Wolfgang Iser states that literary texts often lack direct messages. Instead of using straightforward descriptions, authors leave gaps or blanks, i.e., spaces within the text that come into existence only when readers fill them with their understanding and experiences. This paper's aim is to present Iser's literary theory in an intersectional way by comparing it to the idea of intersemiotic translation. To be more precise, the author uses the example of Netflix's adaptation of Lucy Maud Montgomery's Anne of Green Gables as a form of rendering a book into a film in such a way that certain textual gaps are filled with film images. Intersemiotic translation is a rendition in which signs of one kind of media are translated into the signs of another; film adaptations are the most common, but not the only, type. In this case, the role of the translator is taken by a screenwriter, whose role can reach beyond the direct meaning presented by the author and instead delve into the source material (here, a novel) in a deeper way. When that happens, a screenwriter is able to spot the gaps in the text and fill them with images that can later be presented to the viewers. Anne with an E, the Netflix adaptation of Montgomery's novel, is a highly meaningful example of such a rendition, because the 2017 series was broadcast more than a hundred years after the first edition of the novel was published. This means that what the author might not have been able to show in her text can now be presented in a more open way. The screenwriter used this opportunity to represent certain groups in the film, i.e., racial and sexual minorities, and women. Nonetheless, the series does not alter the novel; in fact, it adds to it by filling the blanks with more direct images. In the paper, fragments of the first season of Anne with an E are analysed in comparison to their source, the novel by Montgomery. The main purpose is to show how intersemiotic translation, connected with Iser's literary theory, can enrich the understanding of works of art, culture, media, and literature.

Keywords: intersemiotic translation, film, literary gaps, representation

Procedia PDF Downloads 311
10612 Mobile Phone Text Reminders and Voice Call Follow-ups Improve Attendance for Community Retail Pharmacy Refills; Learnings from Lango Sub-region in Northern Uganda

Authors: Jonathan Ogwal, Louis H. Kamulegeya, John M. Bwanika, Davis Musinguzi

Abstract:

Introduction: Community retail pharmacy drug distribution points (CRPDDP) were implemented in the Lango sub-region as part of the Ministry of Health's response to improving access and adherence to antiretroviral treatment (ART). Clients received their ART refills from nearby local pharmacies, hence the need for continuous engagement through mobile phone appointment reminders and health messages. We share learnings from the implementation of mobile text reminders and voice call follow-ups among ART clients attending the CRPDDP program in northern Uganda. Methods: A retrospective review of electronic medical records from the four pharmacies allocated for CRPDDP in the Lira and Apac districts of the Lango sub-region in Northern Uganda was done from February to August 2022. The process involved collecting the phone contacts of eligible clients from the health facility appointment register and uploading them onto a messaging platform customized with RapidPro, an open-source software. Client information, including code name, phone number, next appointment date, and the allocated pharmacy for the ART refill, was collected and kept confidential. Contacts received appointment reminder messages and other messages on positive living as an ART client. Routine voice call follow-ups were done to ascertain the picking up of ART from the refill pharmacy. Findings: In total, 1,354 clients were reached from the four allocated pharmacies, all found in urban centers. 972 clients received short message service (SMS) appointment reminders, and 382 were followed up through voice calls. The majority (75%) of the clients returned for refills on the appointed date, 20% returned within four days after the appointment date, and the remaining 5% needed follow-up, reporting that they were not in the district on the appointment date due to other engagements. Conclusion: The use of mobile text reminders and voice call follow-ups improves attendance for community retail pharmacy refills.
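The reminder and follow-up selection described in the Methods boils down to due-date queries over the appointment register. A stdlib-only sketch is given below; the field names are assumptions, and the actual message sending is delegated to the messaging platform.

```python
from datetime import date, timedelta

def reminders_due(clients, today, days_ahead=2):
    """Clients whose next refill appointment falls within the SMS reminder window."""
    window_end = today + timedelta(days=days_ahead)
    return [c for c in clients
            if today <= c["next_appointment"] <= window_end]

def escalate_to_voice_call(clients, today, grace_days=4):
    """Clients whose appointment passed more than grace_days ago: follow up by call."""
    return [c for c in clients
            if (today - c["next_appointment"]).days > grace_days]
```

The four-day grace period mirrors the finding that 20% of clients returned within four days of the appointed date; only the remaining clients would need a voice call.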

Keywords: antiretroviral treatment, community retail drug distribution points, mobile text reminders, voice call follow-up

Procedia PDF Downloads 96
10611 Reproductive Performance of Dairy Cows at Different Parities: A Case Study in Enrekang Regency, Indonesia

Authors: Muhammad Yusuf, Abdul Latief Toleng, Djoni Prawira Rahardja, Ambo Ako, Sahiruddin Sahiruddin, Abdi Eriansyah

Abstract:

The objective of this study was to assess the reproductive performance of dairy cows at different parities. A total of 60 Holstein-Friesian dairy cows with parities one to three, from five small farms raised by local farmers, were used in the study. All cows were confined in a tie-stall barn with rubber on the concrete floor. The herds were visited twice for a survey with the help of a questionnaire. The reproductive parameters used in the study were days open, calving interval, and services per conception (S/C). The results showed that the mean (±SD) days open of cows in parity 2 was slightly longer than that of cows in parity 3 (228.2±121.5 vs. 205.5±144.5; P=0.061). No cows conceived within 85 days postpartum in parity 3, in comparison to 13.8% of cows in parity 2. However, the proportions of cows conceiving within 150 days postpartum in parity 2 and parity 3 were 30.1% and 36.4%, respectively. Likewise, by 210 days after calving, the proportion of cows conceiving in parity 3 was higher than in parity 2 (72.8% vs. 44.8%; P<0.05). The mean (±SD) calving intervals of cows in parity 2 and parity 3 were 508.2±121.5 and 495.5±144.1 days, respectively. The proportion of cows with calving intervals within 400 and 450 days was higher in parity 3 than in parity 2 (23.1% vs. 17.2% and 53.9% vs. 31.0%). Cows in parity 1 had a significantly (P<0.01) lower number of S/C than cows in parity 2 and parity 3 (1.6±1.2 vs. 3.5±3.4 and 3.3±2.1). It can be concluded that the reproductive performance of the cows is affected by parity.

Keywords: dairy cows, parity, days open, calving interval, service per conception

Procedia PDF Downloads 254
10610 Innovating and Disrupting Higher Education: The Evolution of Massive Open Online Courses

Authors: Nabil Sultan

Abstract:

A great deal has been written about Massive Open Online Courses (MOOCs) since 2012 (considered by some as the year of the MOOC). The emergence of MOOCs generated great interest among academics and technology experts as well as ordinary people. Some authors perceived MOOCs as the next big thing that would disrupt education; others saw them as another fad that would fade once it ran its course (as most fads do). But MOOCs did not turn out to be a fad: they are still around and, most importantly, have evolved into something that is beginning to look like a viable business model. This paper explores this phenomenon within the theoretical frameworks of disruptive innovation and jobs to be done, as developed by Clayton Christensen and his colleagues, and its implications for the future of higher education (HE).

Keywords: MOOCs, disruptive innovations, higher education, jobs theory

Procedia PDF Downloads 267
10609 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photons Beams in Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth doses (PDDs) and in-plane and cross-plane profiles of Varian golden beam data to measured data for 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering PDDs, profiles, and dose rate tables for open and wedged fields into the treatment planning system, enabling it to calculate monitor units (MUs) and dose distributions. Varian offers a generic set of beam data as reference data; however, it is not recommended for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate whether the generic data are reliable enough for clinical purposes. Methods and Materials: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian’s algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a semiflex ionization chamber and MEPHYSTO software. The Varian golden beam data available online were compared with the measured data to evaluate whether the golden beam data are accurate enough to be used for the commissioning of the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was at most 2%. For PDDs, the deviation increases at deeper depths; similarly, profiles show increasing deviation at larger field sizes and greater depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
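The comparison step can be sketched as a point-by-point percentage deviation of the measured curve against the reference curve. The PDD values below are illustrative numbers chosen for the sketch, not actual Varian golden beam data:

```python
def pdd_deviation(measured, golden):
    """Point-by-point percentage deviation between a measured PDD curve
    and the golden-beam reference, relative to the reference value."""
    return [100.0 * (m - g) / g for m, g in zip(measured, golden)]

# Illustrative 6 MV PDD values (%) at a few depths (dmax, 5, 10, 15 cm)
golden = [100.0, 86.0, 66.0, 50.0]
measured = [100.0, 86.5, 66.8, 50.9]

dev = pdd_deviation(measured, golden)
print([round(d, 2) for d in dev],
      "max |dev| =", round(max(abs(d) for d in dev), 2), "%")
```

A curve is then judged against the clinical tolerance (here, the 2% figure reported in the abstract) by looking at the maximum absolute deviation.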

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 485
10608 CNC Milling-Drilling Machine Cutting Tool Holder

Authors: Hasan Al Dabbas

Abstract:

This paper addresses how mechanical machinery captures a major share of innovation in drilling and milling chuck technology. Users demand higher speeds in milling because they are cutting more aluminum and rely on higher speeds to eliminate secondary finishing operations. To meet that demand, milling-machine builders have enhanced their machines’ rigidity. Faster cutting has also caught up with boring mills, and cooling these machines’ internal components is a challenge at high speeds. Another predicted trend is greater use of controlled axes to let machines perform many more operations on five sides of a workpiece without having to move or re-fixture it. Advances in mechanical engineering technology have helped to create high-speed machining equipment. To accompany these changes in milling and drilling machine chucks, the demand for easier-to-use software has increased, and an open-architecture controller is being sought that would allow flexibility and information exchange.

Keywords: drilling, milling, chucks, cutting edges, tools, machines

Procedia PDF Downloads 570
10607 Facial Recognition of University Entrance Exam Candidates Using FaceMatch Software in Iran

Authors: Mahshid Arabi

Abstract:

In recent years, remarkable advancements in artificial intelligence and machine learning have led to the development of facial recognition technologies, now employed in a wide range of applications, including security, surveillance, healthcare, and education. In education, identifying university entrance exam candidates has been a fundamental challenge. Traditional methods such as ID cards and handwritten signatures are inefficient, prone to fraud, and susceptible to error. In this context, advanced technologies like facial recognition can be an effective and efficient way to increase the accuracy and reliability of identity verification in entrance exams. This article examines the use of FaceMatch software for recognizing the faces of university entrance exam candidates in Iran. The main objective of this research is to evaluate the efficiency and accuracy of FaceMatch software in identifying candidates, to prevent fraud, and to ensure the authenticity of individuals' identities; the research also investigates the advantages and challenges of using this technology in Iran's educational system. The research was conducted using an experimental method and random sampling: 1,000 university entrance exam candidates in Iran were selected as the sample, and their facial images were processed and analyzed with FaceMatch. The software's performance was evaluated using several metrics, including accuracy rate, error rate, and processing time. The results indicated that FaceMatch could identify candidates with an accuracy of 98.5% and an error rate below 1.5%, and the average processing time per candidate image was less than 2 seconds. Statistical evaluation of the results using analysis of variance (ANOVA) and t-tests showed that the observed differences were significant and that the software's accuracy in identity verification is high. These findings suggest that FaceMatch can be used effectively to identify university entrance exam candidates in Iran: the technology enhances security, prevents fraud, and simplifies and streamlines exam administration. However, challenges such as preserving candidates' privacy and the costs of implementation must also be considered. Given the promising results of this research, it is recommended that this technology be more widely implemented in the country's educational system.
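The reported metrics (accuracy rate, error rate, and mean processing time) can all be computed from per-candidate match outcomes. A minimal sketch with toy data, not the study's 1,000-candidate results:

```python
def evaluate_matcher(results):
    """Summarise a face-verification run. `results` is a list of
    (is_correct_match, processing_seconds) tuples, one per candidate."""
    n = len(results)
    correct = sum(1 for ok, _ in results if ok)
    accuracy = 100.0 * correct / n       # accuracy rate (%)
    error_rate = 100.0 - accuracy        # error rate (%)
    avg_time = sum(t for _, t in results) / n  # mean seconds per image
    return accuracy, error_rate, avg_time

# Toy run of 8 candidates (illustrative outcomes and timings)
run = [(True, 1.2), (True, 1.5), (True, 0.9), (True, 1.8),
       (False, 2.1), (True, 1.1), (True, 1.4), (True, 1.6)]
print(evaluate_matcher(run))
```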

Keywords: facial recognition, FaceMatch software, Iran, university entrance exam

Procedia PDF Downloads 40
10606 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement

Authors: Hadi Ardiny, Amir Mohammad Beigzadeh

Abstract:

Through the combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data sources. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a NaI radiation detector. In the experiment, three mobile robots were used, one of them carrying a radioactive source. An algorithm was developed to identify the contaminated robot through correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY-plane coordinate system, while the detector system records the gamma-ray count. The positions of the robots and the corresponding counts from the moving source were modeled with the MCNPX simulation code, taking the experimental geometry into account. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. Such modeling techniques prove valuable for designing different scenarios and intelligent systems before initiating any experiments.
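The identification step can be sketched as follows: for each robot, compute the signal its position history would predict at the detector under an inverse-square law, then pick the robot whose predicted signal correlates best with the measured count rate. This is a simplified stand-in for the authors' algorithm, with hypothetical tracks and counts:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def find_source(tracks, counts, detector=(0.0, 0.0)):
    """Return the robot whose inverse-square predicted signal at the
    detector best correlates with the measured gamma-ray counts."""
    best, best_r = None, -2.0
    for robot, track in tracks.items():
        predicted = [1.0 / ((x - detector[0]) ** 2 + (y - detector[1]) ** 2)
                     for x, y in track]
        r = pearson(predicted, counts)
        if r > best_r:
            best, best_r = robot, r
    return best, best_r

# Hypothetical XY tracks: robot A approaches the detector, B moves away
tracks = {"A": [(4, 0), (3, 0), (2, 0), (1, 0)],
          "B": [(1, 0), (2, 0), (3, 0), (4, 0)]}
counts = [30, 45, 110, 400]  # measured counts rise as A approaches

print(find_source(tracks, counts))
```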

Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems

Procedia PDF Downloads 116
10605 Usability Evaluation in Practice: Selecting the Appropriate Method

Authors: Hanan Hayat, Russell Lock

Abstract:

The importance of usability in ensuring software quality is well established in the literature and widely accepted by software development practitioners. Consequently, numerous usability evaluation methods have been developed. However, the availability of a large variety of evaluation methods, alongside insufficient studies critically analysing them, has made the selection process ambiguous for practitioners who are not usability experts. This study investigates the factors affecting the selection of usability evaluation methods within a project by interviewing a software development team. The data gathered are then analysed and used to develop a framework. The framework offers a solution to the selection process by adjusting to an individual project's resources and goals. It can be further evaluated to verify its applicability and usability within the domain of this study.

Keywords: usability evaluation, evaluating usability in non-user-centered designs, usability evaluation methods (UEM), usability evaluation in projects

Procedia PDF Downloads 153
10604 Bayes Estimation of Parameters of Binomial Type Rayleigh Class Software Reliability Growth Model using Non-informative Priors

Authors: Rajesh Singh, Kailash Kale

Abstract:

In this paper, a binomial-process model of software failure occurrence is considered, and the failure intensity is characterized by a one-parameter Rayleigh-class software reliability growth model (SRGM). The proposed SRGM is a mathematical function of two parameters: the total number of failures, η0, and the scale parameter, η1. It is assumed that very little or no information is available about either parameter; considering non-informative priors for both, the Bayes estimators of η0 and η1 are obtained under a squared-error loss function. The proposed Bayes estimators are compared with the corresponding maximum likelihood estimators on the basis of risk efficiencies obtained by Monte Carlo simulation. It is concluded that both proposed Bayes estimators, of the total number of failures and of the scale parameter, perform well for a proper choice of execution time.
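The Monte Carlo risk comparison can be sketched in a deliberately simplified setting: i.i.d. Rayleigh observations with scale σ, estimating θ = σ², under the non-informative prior π(θ) ∝ 1/θ and squared-error loss (so the Bayes estimator is the posterior mean). This illustrates the methodology only; it is not the paper's binomial-process SRGM, and the parameter values are arbitrary:

```python
import math
import random

def simulate_risk(sigma=2.0, n=20, reps=5000, seed=1):
    """Empirical squared-error risk of the MLE vs. the Bayes estimator
    (prior 1/theta, squared-error loss) for theta = sigma^2 of a
    Rayleigh sample."""
    rng = random.Random(seed)
    theta = sigma ** 2
    mse_mle = mse_bayes = 0.0
    for _ in range(reps):
        # x^2 for Rayleigh(sigma) is Exponential with mean 2*theta,
        # so sum x_i^2 can be drawn directly by inverse transform
        s = sum(-2.0 * theta * math.log(1.0 - rng.random())
                for _ in range(n))
        mle = s / (2 * n)           # maximum likelihood estimate of theta
        bayes = (s / 2) / (n - 1)   # posterior mean: inverse-gamma(n, s/2)
        mse_mle += (mle - theta) ** 2
        mse_bayes += (bayes - theta) ** 2
    return mse_mle / reps, mse_bayes / reps

print(simulate_risk())
```

The ratio of the two empirical risks gives the risk efficiency used to compare the estimators.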

Keywords: binomial process, non-informative prior, maximum likelihood estimator (MLE), rayleigh class, software reliability growth model (SRGM)

Procedia PDF Downloads 387
10603 The Determinants and Effects of R&D Outsourcing in Korean Manufacturing Firm

Authors: Sangyun Han, Minki Kim

Abstract:

R&D outsourcing is a strategy for acquiring firm competitiveness through open innovation. As firms’ total R&D investment has increased in Korea, the share accounted for by R&D outsourcing has also grown. In this paper, we investigate the determinants of firms’ R&D outsourcing and its effects. Analyzing the determinants of R&D outsourcing and its effect on firm performance raises both academic and policy issues. First, from an academic viewpoint, identifying the determinants of R&D outsourcing is linked to why firms engage in open innovation, a question that can be addressed through the resource-based view, core competence theory, and related frameworks. Second, the findings offer science and technology policy implications for transferring public intellectual property to the private sector: in particular, for supporting more SMEs and ventures, the government can ground its policies in evidence of why and how firms outsource R&D.

Keywords: determinants, effects, R&D, outsourcing

Procedia PDF Downloads 504
10602 Human Dignity as a Source and Limitation of Personal Autonomy

Authors: Jan Podkowik

Abstract:

The article discusses the mutual relationship between human dignity and personal autonomy. According to the constitutions of many countries and international human rights law, human dignity is a fundamental and inviolable value. It is the source of all freedoms and rights, including personal autonomy. Human dignity, as an inherent, inalienable, and non-gradable value attributed to all people, justifies freedom of action according to one's will and one's vision of the good life. On the other hand, human dignity imposes immanent restrictions on personal autonomy, for example regarding decisions on the commercialization of one’s body. This points to the paradox of dignity: it is at once the source of freedom and the basic condition of its limitation. The paper presents a theoretical concept of human dignity as an objective value across legal systems, determining the boundaries of the legal protection of personal autonomy; on this view, it is not correct to perceive human dignity and freedom as opposing values. The reference points are the normative provisions of the Polish Constitution and the European Convention on Human Rights and Fundamental Freedoms, as well as judgments of constitutional courts.

Keywords: autonomy, constitution, human dignity, human rights

Procedia PDF Downloads 295
10601 Biomass Energy: "The Boon for the World"

Authors: Shubham Giri Goswami, Yogesh Tiwari

Abstract:

In today’s developing world, India and other countries are developing different instruments and accessories for a better, happier, and more prosperous standard of living. Humans have long relied on different energy sources, and scientists and researchers have developed many of them, both renewable and non-renewable: fossil fuels such as coal, gas, and petroleum products as non-renewable sources, and solar and wind energy as renewable sources. Non-renewable energy sources create pollution of air, water, and more, and their continued use by humans makes the future uncertain. To minimize these environmental effects and preserve a healthy environment, a solution lies in renewable energy sources such as biomass, solar, and wind. Biomass energy in particular offers several techniques that make it a good energy source for people. Domestic waste, such as the dung extracted daily from cows and many other domestic products, can be used as eco-friendly fertilizer. A cow can produce roughly 8-12 kg of dung per day, which can be used to make vermicompost fertilizer. Furthermore, calf urine can be used as an insecticide, and the use of such compounds destroys insects and thus decreases communicable diseases. Biomass energy can therefore be used by everyone, including in rural areas that non-renewable energy sources cannot reach easily. Biomass can be used to produce fertilizers, cow-dung plants, and other power-generation techniques; this energy is clean, pollution-free, and available everywhere, and thus helps save our beautiful, life-giving blue planet, Earth. Biomass energy may be a boon for the world in the future.

Keywords: biomass, energy, environment, human, pollution, renewable, solar energy, sources, wind

Procedia PDF Downloads 518
10600 Antioxidant Capacity of Different Broccoli Cultivars at Various Harvesting Dates

Authors: S. Graeff-Hönninger, J. Pfenning, V. Gutsal, S. Wolf, S. Zikeli, W. Claupein

Abstract:

Broccoli is considered a rich source of antioxidants (AOX) such as flavonoids, polyphenols, and anthocyanins, and is of major interest especially in the organic sector. However, antioxidant capacity depends on the environment and often varies between cultivars. The aim of the study was to investigate the impact of cultivar and harvest date on the antioxidant capacity of broccoli. Antioxidant activity was determined using a Photochem® analyzer and a kit of reagent solutions. The results showed that the lipid-soluble (ACL) and water-soluble (ACW) antioxidant capacities of broccoli heads varied significantly between the four harvest dates, but not among the different cultivars. The highest ACL concentration was measured in broccoli heads harvested in September 2011, followed by heads harvested at the beginning of July 2012. ACW was highest in heads harvested in October 2011 and lowest in heads harvested in June 2012. Overall, the study indicates that the harvest date, and thus the growing conditions, are highly important for the final antioxidant capacity of broccoli.

Keywords: broccoli, open-pollinating, harvest date, epidemiological studies

Procedia PDF Downloads 422
10599 Arabic Light Stemmer for Better Search Accuracy

Authors: Sahar Khedr, Dina Sayed, Ayman Hanafy

Abstract:

Arabic is one of the most ancient and important languages in the world. It has more than 250 million native speakers, and more than twenty countries have Arabic as one of their official languages. In the past decade, we have witnessed a rapid evolution in smart devices, social networks, and the technology sector, which has led to the need for tools and libraries that properly handle the Arabic language in different domains. Stemming is one of the most crucial linguistic fundamentals and is used in many applications, especially in information extraction and text mining. The motivation behind this work is to enhance an Arabic light stemmer to serve the data mining industry and make it available to the open-source community. The presented implementation enhances the Arabic light stemmer by extending it with a new set of rules and patterns accompanied by an adjusted procedure. The study demonstrates a significant enhancement in search accuracy, with an average 10% improvement in comparison with previous work.
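A light stemmer of this kind strips a small set of frequent prefixes and suffixes while keeping the stem above a minimum length, without full morphological analysis. The rule set below is a minimal illustration, not the authors' enhanced rules or patterns:

```python
# Minimal illustrative rule set (definite article, conjunctions, common
# plural/feminine endings) -- not the paper's rule set
PREFIXES = ["ال", "وال", "بال", "كال", "فال", "لل", "و"]
SUFFIXES = ["ات", "ون", "ين", "ان", "ها", "ية", "ة", "ه"]

def light_stem(word, min_stem=2):
    """Strip at most one prefix and one suffix, longest match first,
    keeping the stem at least `min_stem` letters long."""
    for p in sorted(PREFIXES, key=len, reverse=True):
        if word.startswith(p) and len(word) - len(p) >= min_stem:
            word = word[len(p):]
            break
    for s in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(s) and len(word) - len(s) >= min_stem:
            word = word[:-len(s)]
            break
    return word

print(light_stem("المكتبات"))  # strips the article and the plural ending
```

In retrieval, both the indexed text and the query are passed through the same stemmer, so surface variants of a word collapse to one index term, which is where the search-accuracy gain comes from.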

Keywords: Arabic data mining, Arabic information extraction, Arabic light stemmer, Arabic stemmer

Procedia PDF Downloads 305
10598 Development of Concurrent Engineering through the Application of Software Simulations of Metal Production Processing and Analysis of the Effects of Application

Authors: D. M. Eric, D. Milosevic, F. D. Eric

Abstract:

Concurrent engineering technologies are a modern concept in manufacturing engineering. One of the key goals in designing modern technological processes is further reduction of production costs, in the prototype and preparation stages as well as during serial production. Thanks to the many segments of concurrent engineering, these goals can be accomplished much more easily. In this paper, we give an overview of the advantages of using modern software simulations compared with the classical approach to designing technological processes for metal forming. Significant savings are achieved thanks to electronic simulation and software detection of all possible irregularities in the working regime of the technological process. For the expected results to be optimal, the input parameters must be objective and must reliably represent the values of these parameters under real conditions. Since metal forming is considered here, particularly important parameters are the coefficient of friction between the workpiece material and the tools, as well as the parameters of the flow curve of the processed material. The paper also presents the experimental determination of some of these parameters.
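Flow-curve parameters of the kind mentioned above are commonly supplied to forming simulations through a hardening law. As one hedged example (the paper does not name a specific law), the Hollomon relation σ = K·εⁿ can be fitted to tensile-test data by a least-squares line fit in log-log space; the material constants below are illustrative:

```python
import math

def hollomon_stress(strain, K, n):
    """Hollomon flow curve sigma = K * eps^n (true stress in MPa)."""
    return K * strain ** n

def fit_hollomon(strains, stresses):
    """Estimate K and n via a least-squares line fit in log-log space:
    log(sigma) = log(K) + n * log(eps)."""
    xs = [math.log(e) for e in strains]
    ys = [math.log(s) for s in stresses]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    n = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    K = math.exp(my - n * mx)
    return K, n

# Illustrative tensile-test points for a steel-like material
# (generated here with K = 530 MPa, n = 0.26)
eps = [0.02, 0.05, 0.10, 0.15, 0.20]
sig = [hollomon_stress(e, 530.0, 0.26) for e in eps]

K, n = fit_hollomon(eps, sig)
print(round(K, 1), round(n, 3))
```

The fitted K and n (together with a measured friction coefficient) are then the objective input parameters that a forming-simulation package needs.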

Keywords: production technologies, metal processing, software simulations, effects of application

Procedia PDF Downloads 231