Search results for: coordinate modal assurance criterion
927 Short Arc Technique for Baseline Determinations
Authors: Gamal F. Attia
Abstract:
The baselines are the distances and lengths of the chords between the projections of the positions of the laser stations on the reference ellipsoid. In satellite geodesy, it is very important to determine the optimal length of the orbital arc along which laser measurements are to be carried out. It is clear that for dynamical methods, long arcs (one month or more) are to be used. With such arcs, more errors arise from the modeling of different physical forces, such as the earth's gravitational field, air drag, solar radiation pressure, and others, which may influence the accuracy of the estimation of the satellite's position; at the same time, the measurement errors can be almost completely excluded, and high stability in the determination of the relative coordinate system can be achieved. It is possible to diminish the influence of the modeling errors by using short arcs of the satellite orbit (several revolutions or days), but the station coordinates estimated from different arcs can then differ from each other by a quantity larger than statistical zero. Under the semi-dynamical 'short arc' method, one or several passes of the satellite within the zone of simultaneous visibility from both ends of the chord are used, and the estimated parameter in this case is the length of the chord. The comparison of the same baselines calculated with the long-arc and short-arc methods shows good agreement and even speaks in favor of the latter. In this paper the short arc technique has been explained and 3 baselines have been determined using the 'short arc' method.
Keywords: baselines, short arc, dynamical, gravitational field
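As a concrete illustration of the quantity being estimated, the sketch below computes a baseline as the straight-line chord between two stations projected onto the reference ellipsoid: geodetic coordinates are converted to Earth-centred Cartesian (ECEF) coordinates at zero height, and the chord is the Euclidean distance between them. The ellipsoid parameters and station positions are assumed for illustration; the paper does not specify them.

```python
import math

# WGS84-style reference ellipsoid (assumed; the paper does not state which one)
A = 6378137.0                # semi-major axis [m]
F = 1 / 298.257223563        # flattening
E2 = F * (2 - F)             # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg):
    """Project a station onto the ellipsoid (h = 0) and return ECEF coordinates [m]."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)   # prime vertical radius
    x = n * math.cos(lat) * math.cos(lon)
    y = n * math.cos(lat) * math.sin(lon)
    z = n * (1 - E2) * math.sin(lat)
    return x, y, z

def chord_baseline(sta1, sta2):
    """Straight-line chord length between two ellipsoidal projections."""
    p1, p2 = geodetic_to_ecef(*sta1), geodetic_to_ecef(*sta2)
    return math.dist(p1, p2)

# Hypothetical laser-station positions (lat, lon in degrees)
print(f"baseline chord: {chord_baseline((30.0, 31.2), (46.8, 30.8)):.3f} m")
```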
Procedia PDF Downloads 463
926 Representational Issues in Learning Solution Chemistry at Secondary School
Authors: Lam Pham, Peter Hubber, Russell Tytler
Abstract:
Students' conceptual understanding of chemistry concepts and phenomena involves the capability to coordinate across the three levels of Johnston's triangle model. This triplet model is based on reasoning about chemical phenomena across the macro, sub-micro and symbolic levels. In chemistry education, there is a need for further examining inquiry-based approaches that enhance students' conceptual learning and problem-solving skills. This research adopted a directed inquiry pedagogy, based on students constructing and coordinating representations, to investigate senior school students' capabilities to flexibly move across Johnston's levels when learning dilution and molar concentration concepts. The participants comprise 50 grade 11 and 20 grade 10 students and 4 chemistry teachers who were selected from 4 secondary schools located in metropolitan Melbourne, Victoria. This research into classroom practices used ethnographic methodology and involved teachers working collaboratively with the research team to develop representational activities and lesson sequences in the instruction of a unit on solution chemistry. The representational activities included challenges (Representational Challenges, RCs) that used 'representational tools' to assist students to move across Johnston's three levels for dilution phenomena. In this report, the 'representational tool' called the 'cross and portion' model was developed and used in teaching and learning the molar concentration concept. Students' conceptual understanding and problem-solving skills when learning with this model are analysed through group case studies of year 10 and year 11 chemistry students. In learning dilution concepts, students in both group case studies actively conducted a practical experiment and used their own language and visualisation skills to represent dilution phenomena at the macroscopic level (RC1). At the sub-microscopic level, students generated and negotiated representations of the chemical interactions between solute and solvent underpinning the dilution process. At the symbolic level, students demonstrated their understanding of dilution concepts by drawing chemical structures and performing mathematical calculations. When learning molar concentration with the 'cross and portion' model (RC2), students coordinated across visual and symbolic representational forms and Johnston's levels to construct representations. The analysis showed that in RC1, year 10 students needed more 'scaffolding' in being inducted into representations to make explicit the form and function of sub-microscopic representations. In RC2, year 11 students showed clarity in using visual representations (drawings) linked to mathematics to solve representational challenges about molar concentration. In contrast, year 10 students struggled to match up the two systems: the symbolic system of mole per litre ('cross and portion') and the visual representation (drawing). These conceptual problems do not lie in the students' mathematical calculation capability but rather in their capability to align visual representations with the symbolic mathematical formulations. This research also found that students in both group case studies were able to coordinate representations when probed about the use of the 'cross and portion' model (in RC2) to demonstrate the molar concentration of diluted solutions (in RC1). Students mostly succeeded in constructing 'cross and portion' models to represent the reduction of molar concentration along the concentration gradients.
In conclusion, this research demonstrated how the strategic introduction and coordination of chemical representations, across modes and across the macro, sub-micro and symbolic levels, supported student reasoning and problem solving in chemistry.
Keywords: cross and portion, dilution, Johnston's triangle, molar concentration, representations
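For readers outside chemistry education, the symbolic-level calculations the students perform reduce to the molar concentration definition c = n/V and the dilution relation c1·V1 = c2·V2. A minimal sketch with assumed example quantities:

```python
def molar_concentration(moles_solute, volume_litres):
    """c = n / V, in mol/L."""
    return moles_solute / volume_litres

def diluted_concentration(c1, v1, v2):
    """Apply the dilution relation c1*v1 = c2*v2 to obtain c2."""
    return c1 * v1 / v2

# Example: 0.10 mol of solute dissolved in 0.50 L, then diluted to 2.0 L
c1 = molar_concentration(0.10, 0.50)       # 0.20 mol/L
c2 = diluted_concentration(c1, 0.50, 2.0)  # 0.05 mol/L
print(c1, c2)
```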
Procedia PDF Downloads 137
925 Pharmaceutical Scale up for Solid Dosage Forms
Authors: A. Shashank Tiwari, S. P. Mahapatra
Abstract:
Scale-up is defined as the process of increasing batch size. Scale-up of a process can also be viewed as a procedure for applying the same process to different output volumes. There is a subtle difference between these two definitions: batch size enlargement does not always translate into a size increase of the processing volume. In mixing applications, scale-up is indeed concerned with increasing the linear dimensions from the laboratory to the plant size. On the other hand, processes exist (e.g., tableting) where the term 'scale-up' simply means enlarging the output by increasing the speed. To complete the picture, one should point out special procedures where an increase of the scale is counterproductive and 'scale-down' is required to improve the quality of the product. In moving from Research and Development (R&D) to production scale, it is sometimes essential to have an intermediate batch scale. This is achieved at the so-called pilot scale, which is defined as the manufacture of drug product by a procedure fully representative of and simulating that used for full manufacturing scale. This scale also makes it possible to produce enough product for clinical testing and to manufacture samples for marketing. However, inserting an intermediate step between R&D and production scales does not, in itself, guarantee a smooth transition. A well-defined process may generate a perfect product both in the laboratory and the pilot plant and then fail quality assurance tests in production.
Keywords: scale up, research, size, batch
Procedia PDF Downloads 413
924 Enhancing the Performance of Bug Reporting System by Handling Duplicate Reporting Reports: Artificial Intelligence Based Mantis
Authors: Afshan Saad, Muhammad Saad, Shah Muhammad Emaduddin
Abstract:
Bug reporting systems are among the most important tools guiding maintenance activities in software engineering. Duplicate bug reports, which describe bugs and issues already present in the bug reporting system repository, increase the processing time of the bug triager, who monitors all such activities, and of the software programmers who spend time working on the reports assigned by the triager. These reports can reveal imperfections and degrade software quality. As the number of potential duplicate bug reports increases, the number of bug reports in the bug repository increases. Identifying duplicate bug reports helps in decreasing the development workload of fixing defects. However, it is difficult to manually identify all possible duplicates because of the huge number of already reported bugs. In this paper, an artificial intelligence based system using Mantis is proposed to automatically detect duplicate bug reports. When a new bug is submitted to the repository, the triager will mark it with a tag and investigate, by matching, whether it is a duplicate of an existing bug report. Reports with duplicate tags will be eliminated from the repository, which will not only improve the performance of the system but can also save the cost and effort wasted on bug triage and on finding duplicate bugs.
Keywords: bug tracking, triager, tool, quality assurance
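The paper does not publish its Mantis implementation; the sketch below shows one common way such duplicate detection is done, using TF-IDF text vectors and cosine similarity with an assumed threshold. All report texts and the threshold value are illustrative.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Existing reports in the repository (toy data for illustration)
repository = [
    "App crashes when saving a file with a long name",
    "Login button unresponsive on the settings page",
    "Crash on file save when the filename is too long",
]
new_report = "Application crashes while saving files with very long names"

THRESHOLD = 0.35  # assumed cut-off; a real triage tool would tune this empirically

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(repository + [new_report])

# Similarity of the new report (last row) against every stored report
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
best = scores.argmax()
if scores[best] >= THRESHOLD:
    print(f"tag as duplicate of report #{best} (similarity {scores[best]:.2f})")
else:
    print("no duplicate found; accept as new report")
```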
Procedia PDF Downloads 193
923 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program
Authors: Ming Wen, Nasim Nezamoddini
Abstract:
Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing tasks, that requires the replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework to design a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives considering the feedback received from experts and program users.
Keywords: finite element analysis, FEA, random vibration fatigue, process automation, analytical hierarchy process, AHP, TOPSIS, multiple-criteria decision-making, MCDM
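As an indication of how the TOPSIS step of such a framework works, here is a minimal sketch: the decision matrix is vector-normalized, weighted (e.g., with AHP-derived weights), and each alternative is ranked by its closeness to the ideal solution. The scores and weights below are hypothetical, not the paper's data.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    decision_matrix: (alternatives x criteria); benefit[j] True if larger-is-better."""
    m = np.asarray(decision_matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the (e.g. AHP-derived) weights
    norm = m / np.linalg.norm(m, axis=0)
    v = norm * weights
    # Ideal and anti-ideal points depend on criterion direction
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)   # closeness coefficient, higher is better

# Hypothetical scores for three design alternatives on three criteria
scores = [[7, 9, 9], [8, 7, 8], [9, 6, 8]]
weights = np.array([0.5, 0.3, 0.2])          # e.g. from an AHP pairwise comparison
closeness = topsis(scores, weights, benefit=[True, True, True])
print(closeness.argsort()[::-1])              # alternatives ranked best-first
```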
Procedia PDF Downloads 112
922 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database
Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami
Abstract:
The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means based algorithm is developed and experimentally evaluated. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. The algorithm uses the City Block distance and a new stop criterion to guarantee convergence. Experiments conducted on a real data set show its high performance when compared with the original k-means version.
Keywords: pattern recognition, global terrorism database, Manhattan distance, k-means clustering, terrorism data analysis
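A minimal sketch of a k-means-style loop using the City Block metric is shown below; with L1 assignment, the per-dimension median is the natural centroid update, and a simple centre-movement tolerance stands in for the paper's stop criterion (whose exact definition is not given in the abstract).

```python
import numpy as np

def kmeans_manhattan(X, k, max_iter=100, tol=1e-4, seed=0):
    """Lloyd-style clustering with the City Block (L1) metric.
    With L1 assignment, the per-dimension median is the optimal centroid update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(max_iter):
        # Manhattan distance from every point to every center
        d = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)
        labels = d.argmin(axis=1)
        new_centers = np.array([
            np.median(X[labels == j], axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        # Simple stop criterion: centers barely move (the paper defines its own)
        if np.abs(new_centers - centers).sum() < tol:
            break
        centers = new_centers
    return labels, centers

X = np.vstack([np.random.default_rng(1).normal(m, 0.5, (50, 2)) for m in (0, 5)])
labels, centers = kmeans_manhattan(X, k=2)
print(centers)
```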
Procedia PDF Downloads 386
921 Pure Economic Loss: A Trouble Child
Authors: Isabel Mousinho de Figueiredo
Abstract:
Pure economic loss can be brought into the 21st century and become a useful tool to keep the tort of negligence within reasonable limits, provided the concept is minutely reexamined. The term came about when wealth was physical, and Law wanted to be a modern science. As a tool to draw the line, it leads to satisfactory decisions in most cases, but needlessly creates distressing conundrums in others, and these are the ones parties bother to litigate about. Economic loss is deemed to be pure based on a blind negative criterion of physical harm, which inadvertently smelts vastly disparate problems into an indiscernible mass, with arbitrary outcomes. These shortcomings are usually dismissed as minor byproducts, for the lack of a better formula. Law could instead stick to the sound paradigms of the intended rule, and be more specific in identifying the losses deserving of compensation. This would provide a better service to Bench and Bar, and effectively assist everyone navigating the many challenges of Accident Law.
Keywords: accident law, comparative tort law, negligence, pure economic loss
Procedia PDF Downloads 116
920 Audit on the Use of T-MACS Decision Aid for Patients Presenting to ED with Chest Pain
Authors: Saurav Dhawan, Sanchit Bansal
Abstract:
Background: T-MACS is a computer-based decision aid that 'rules in' and 'rules out' ACS using a combination of the presence or absence of six clinical features with only one biomarker measured on arrival: hs-cTnT. T-MACS had a 99.3% negative predictive value and 98.7% sensitivity for ACS, 'ruling out' ACS in 40% of patients while 'ruling in' the 5% at highest risk. We aim to benchmark the use of T-MACS, which could help to conserve healthcare resources, facilitate early discharges, and ensure safe practice.
Methodology: Randomized retrospective data collection (n=300) was carried out from ED electronic records across 3 hospital sites within MFT over a period of 2 months. Data were analysed and compared by percentage usage of T-MACS, number of admissions/discharges, and length of stay in hospital in days.
Results: MRI A&E had the maximum compliance with the use of T-MACS in the trust at 66%, with the fewest admissions (44%) and an average length of stay of 1.825 days. NMG A&E had an extremely low compliance rate (8%), with 75% admissions and an average length of stay of 3.387 days. WYT A&E had no T-MACS recorded, with a maximum of 79% admissions and the longest average length of stay at 5.07 days.
Conclusion: All three hospital sites had a RAG rating of 'RED' as per the compliance levels. The assurance level was calculated as 'Very Limited' across all sites. A positive correlation was observed between compliance with T-MACS and direct discharges from the ED, thereby reducing the average length of stay for patients in the hospital.
Keywords: ACS, discharges, ED, T-MACS
Procedia PDF Downloads 58
919 Mountain Photo Sphere: An Android Application of Mountain Hiking Street View
Authors: Yanto Budisusanto, Aulia Rachmawati
Abstract:
A land navigation technology currently being developed is Google Street View, which provides 360° street views and enables the user to know the physical road conditions from the photo display. For climbers, especially beginners, detailed information on the climbing terrain is needed so that climbers can prepare supplies and strategies before climbing. Therefore, we built a mountaineering guide application named Mountain Photo Sphere. This application displays a 360° panoramic view of a mountain hiking trail, important points along the hiking path, and its surrounding conditions. By combining 360° panoramic photos with tracking paths from coordinate data, a virtual tour is formed. The application is built using the Java language and Android Studio. The hiking trail map is composed using the Google Maps API (gaining access to Google Maps), the Google GEO API (gaining access to Google Maps services), and the OpenStreetMap API (getting map files to be accessed offline in the application). This application can be accessed offline so that climbers can use it during climbing activities.
Keywords: google street view, panoramic photo 360°, mountain hiking, mountain photo sphere
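One building block such an application needs is the along-track length of a hiking path computed from its coordinate data. A minimal sketch using the haversine great-circle formula (an assumption; the abstract does not state how distances are computed), with hypothetical waypoints:

```python
import math

def haversine_m(p1, p2, r=6371000.0):
    """Great-circle distance in metres between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def track_length(points):
    """Total along-track distance of an ordered list of (lat, lon) waypoints."""
    return sum(haversine_m(a, b) for a, b in zip(points, points[1:]))

# Hypothetical waypoints along a hiking trail
trail = [(-7.942, 112.953), (-7.940, 112.951), (-7.937, 112.950)]
print(f"trail length: {track_length(trail):.0f} m")
```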
Procedia PDF Downloads 166
918 The Dynamics of Unsteady Squeezing Flow between Parallel Plates (Two-Dimensional)
Authors: Jiya Mohammed, Ibrahim Ismail Giwa
Abstract:
Unsteady squeezing flow of a viscous fluid between parallel plates is considered. The two plates approach each other symmetrically, causing the squeezing flow. A two-dimensional rectangular Cartesian coordinate system is considered. The Navier-Stokes equations were reduced, using a similarity transformation, to a single fourth-order non-linear ordinary differential equation. The energy equation was transformed to a second-order coupled differential equation. We obtained solutions to the resulting ordinary differential equations via the Homotopy Perturbation Method (HPM). HPM deforms a differential problem into a set of problems that are easier to solve, and it produces an analytic approximate expression in the form of an infinite power series; here only six and five terms are used for the velocity and temperature, respectively. The results reveal that the proposed method is very effective and simple. Comparisons between the present and existing solutions are provided, and it is shown that the proposed method is in good agreement with the Variation of Parameters Method (VPM). The effects of the appropriate dimensionless parameters on the velocity profiles and temperature field are demonstrated with the aid of comprehensive graphs and tables.
Keywords: coupled differential equation, Homotopy Perturbation Method, plates, squeezing flow
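To give a feel for how HPM produces an analytic series, the toy sketch below applies it to the much simpler problem y' + y = 0, y(0) = 1: collecting powers of the embedding parameter p gives v0 = 1 and v_{k+1} = -∫ v_k, and six terms reproduce the Taylor series of e^(-x). The paper's fourth-order squeeze-flow ODE is handled analogously but is far more involved.

```python
import sympy as sp

x = sp.symbols('x')

# Toy problem to illustrate HPM: y' + y = 0, y(0) = 1 (exact solution e^{-x}).
# Embedding: v' + p*v = 0 with v = v0 + p*v1 + p^2*v2 + ...
# Collecting powers of p gives v0' = 0 (so v0 = 1) and v_{k+1}' = -v_k, v_{k+1}(0) = 0.
terms = [sp.Integer(1)]            # v0 from the initial condition
for _ in range(5):
    terms.append(-sp.integrate(terms[-1], (x, 0, x)))   # v_{k+1} = -int_0^x v_k dt

approx = sp.expand(sum(terms))     # six-term HPM series (a truncated expansion)
print(approx)                      # 1 - x + x**2/2 - x**3/6 + x**4/24 - x**5/120
print(sp.simplify(approx - sp.series(sp.exp(-x), x, 0, 6).removeO()))  # 0
```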
Procedia PDF Downloads 474
917 Stable Tending Control of Complex Power Systems: An Example of Localized Design of Power System Stabilizers
Authors: Wenjuan Du
Abstract:
The phase compensation method was proposed based on the concept of damping torque analysis (DTA). It is a method for the design of a PSS (power system stabilizer) to suppress local-mode power oscillations in a single-machine infinite-bus power system. This paper presents the application of the phase compensation method to the design of a PSS in a multi-machine power system. The application is achieved by examining the direct damping contribution of the stabilizer to the power oscillations. By using the linearized equal area criterion, a theoretical proof of the application to PSS design is presented. Hence, the PSS design in this paper is an example of stable tending control by a localized method.
Keywords: phase compensation method, power system small-signal stability, power system stabilizer
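For orientation, a single lead-lag stage designed by the textbook phase compensation formulas looks as follows; the oscillation frequency and required phase lead are hypothetical, not taken from the paper.

```python
import math

def lead_stage(phase_lead_deg, omega):
    """Single lead stage (1 + a*T*s)/(1 + T*s) delivering its maximum phase
    lead phase_lead_deg at frequency omega [rad/s] (textbook formulas)."""
    phi = math.radians(phase_lead_deg)
    a = (1 + math.sin(phi)) / (1 - math.sin(phi))
    T = 1 / (omega * math.sqrt(a))
    return a, T

# Hypothetical numbers: suppose the DTA shows the stabilizing signal lags the
# damping torque axis by 40 degrees at a 1.2 Hz local mode, so the PSS must
# provide 40 degrees of phase lead there.
omega = 2 * math.pi * 1.2
a, T = lead_stage(40.0, omega)
print(f"a = {a:.3f}, T = {T:.4f} s  ->  block (1 + {a*T:.4f}s)/(1 + {T:.4f}s)")
```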
Procedia PDF Downloads 640
916 Integrated Mathematical Modeling and Advance Visualization of Magnetic Nanoparticle for Drug Delivery, Drug Release and Effects to Cancer Cell Treatment
Authors: Norma Binti Alias, Che Rahim Che The, Norfarizan Mohd Said, Sakinah Abdul Hanan, Akhtar Ali
Abstract:
This paper discusses the transport of magnetically targeted drugs through blood within vessels, tissues and cells. Three integrated mathematical models are discussed and analysed for the concentration of drug and the blood flow through magnetic nanoparticles. Cell therapy has brought advancement in the field of nanotechnology to fight against tumors. The systematic therapeutic effect on single cells can reduce the growth of cancer tissue. This nanoscale system can be measured and modeled by identifying some parameters and applying fundamental principles of mathematical modeling and simulation. The mathematical modeling of single-cell growth depends on three types of cell densities: proliferative, quiescent and necrotic cells. The aim of this paper is to enhance the simulation of these three types of models. The first model represents the transport of drugs by coupled partial differential equations (PDEs) of 3D parabolic type in a cylindrical coordinate system. This model is integrated with non-Newtonian flow equations, treating the flowing blood as the medium of the transportation system, and with the magnetic force on the magnetic nanoparticles. The interaction of the magnetic force with the drug's magnetic properties produces induced currents, and the applied magnetic field yields forces that tend to slow the movement of the blood and bring the drug to the cancer cells. Nanoscale devices allow the drug to leave the blood vessels, spread out through the tissue and reach the cancer cells. The second model covers the transport of drug nanoparticles from the vascular system to a single cell. The treatment of the vascular system involves the identification of parameters such as magnetic nanoparticle targeted delivery, blood flow, momentum transport, density and viscosity of the drug and blood media, intensity of the magnetic field and the radius of the capillary. Based on two discretization techniques, the finite difference method (FDM) and the finite element method (FEM), the set of integrated models is transformed into a series of grid points to obtain a large system of equations. The third model is a single-cell density model involving three sets of first-order PDEs for proliferating, quiescent and necrotic cell densities changing over time and space in a Cartesian coordinate system, regulated by different rates of nutrient consumption. The model shows that proliferative and quiescent cell growth depends on certain parameter changes, while the necrotic cells emerge as the tumor core. Several numerical schemes for solving the system of equations are compared and analysed. Simulation and computation of the discretized model are supported by the Matlab and C programming languages on a single processing unit. Numerical results and analyses of the algorithms are presented in informative tables, multiple graphs and multidimensional visualizations. In conclusion, the integration of the three types of mathematical models and the comparison of their numerical performance indicate a superior tool and analysis for solving the complete magnetic drug delivery system, which gives significant insight into the effects on the growth of the targeted cancer cells.
Keywords: mathematical modeling, visualization, PDE models, magnetic nanoparticle drug delivery model, drug release model, single cell effects, avascular tumor growth, numerical analysis
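As a drastically simplified stand-in for the first model's discretization, the sketch below solves a 1D drug-diffusion equation c_t = D c_xx along a vessel segment with the explicit FTCS finite-difference scheme; the coefficient, geometry and boundary values are assumed for illustration only and omit the advection and magnetic-force terms of the full model.

```python
import numpy as np

# Simplified 1D stand-in for the paper's coupled 3D models: drug concentration
# c(x, t) obeying c_t = D * c_xx along a vessel segment, discretized with the
# explicit FTCS finite-difference scheme.
D = 1e-9           # diffusion coefficient [m^2/s] (assumed)
L, nx = 1e-3, 101  # 1 mm segment, number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D          # respects the FTCS stability limit dt <= dx^2/(2D)

c = np.zeros(nx)
c[0] = 1.0                    # constant injection concentration at the inlet

for _ in range(20000):
    lap = (c[:-2] - 2 * c[1:-1] + c[2:]) / dx**2   # second-difference operator
    c[1:-1] += dt * D * lap
    c[0], c[-1] = 1.0, 0.0    # Dirichlet boundaries: source and sink

print(c[::20])                # concentration profile samples along the vessel
```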
Procedia PDF Downloads 428
915 Channel Estimation for LTE Downlink
Authors: Rashi Jain
Abstract:
LTE systems employ Orthogonal Frequency Division Multiplexing (OFDM) as the multiple access technology for the downlink channels. For enhanced performance, accurate channel estimation is required. Various algorithms such as Least Squares (LS), Minimum Mean Square Error (MMSE) and Recursive Least Squares (RLS) can be employed for the purpose. This paper proposes a channel estimation algorithm based on the Kalman filter for the LTE downlink. Using the frequency-domain pilots, the initial channel response is obtained using the LS criterion. A Kalman filter is then employed to track the channel variations in the time domain. To suppress the noise within a symbol, threshold processing is employed. The paper draws a comparison between LS, MMSE, RLS and Kalman filter channel estimation. The parameters for evaluation are Bit Error Rate (BER), Mean Square Error (MSE) and run-time.
Keywords: LTE, channel estimation, OFDM, RLS, Kalman filter, threshold
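A minimal sketch of the two-stage idea, on a toy single-tap channel: the first pilot gives an LS estimate h = y/x, and a scalar Kalman filter then tracks the tap under an assumed random-walk state model (real LTE channel models are more elaborate).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-subcarrier model: the channel tap h_t follows a random walk
# (state model assumed here; LTE channels are usually modelled via Doppler spectra).
n, q, r = 200, 1e-4, 1e-2          # symbols, process noise, measurement noise
h = np.cumsum(np.sqrt(q) * rng.standard_normal(n)) + 1.0
x = np.exp(1j * np.pi / 4) * np.ones(n)            # known pilot symbols
y = h * x + np.sqrt(r) * rng.standard_normal(n)    # received pilots

# Initial estimate from the LS criterion at the first pilot: h = y / x
h_hat, p = y[0] / x[0], 1.0
estimates = []
for t in range(n):
    p += q                                           # predict (random-walk state)
    k = p * np.conj(x[t]) / (abs(x[t])**2 * p + r)   # Kalman gain
    h_hat += k * (y[t] - x[t] * h_hat)               # update with the pilot observation
    p *= (1 - k * x[t]).real                         # covariance update (real-valued)
    estimates.append(h_hat)

print(f"tracking MSE: {np.mean(np.abs(np.array(estimates) - h)**2):.5f}")
```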
Procedia PDF Downloads 355
914 Wally Feelings Test: Validity and Reliability Study
Authors: Gökhan Kayili, Ramazan Ari
Abstract:
This research aimed to adapt the Wally Feelings Test to Turkish children and to perform reliability and validity analyses of the test. The sample of the research was composed of 699 three- to five-year-old Turkish preschoolers attending official and private nursery schools. The schools were selected with a simple random sampling method, considering different socio-economic conditions and different central districts in Konya. In order to determine the reliability of the Wally Feelings Test, internal consistency coefficients (KR-20), split-half reliability and test-retest reliability analyses were performed. During the validation process, construct validity, content/scope validity and concurrent/criterion validity were used. When the validity and reliability of the test were examined, it was seen that the Wally Feelings Test is a valid and reliable instrument to evaluate three- to five-year-old Turkish children's skills in understanding feelings.
Keywords: reliability, validity, wally feelings test, social sciences
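For reference, KR-20 (the internal-consistency coefficient used above for dichotomous items) can be computed as in the sketch below; the response matrix is made up for illustration.

```python
import numpy as np

def kr20(scores):
    """Kuder-Richardson Formula 20 for dichotomous (0/1) item scores.
    scores: (respondents x items) array."""
    s = np.asarray(scores, dtype=float)
    k = s.shape[1]
    p = s.mean(axis=0)                       # proportion passing each item
    q = 1 - p
    var_total = s.sum(axis=1).var(ddof=1)    # variance of total test scores
    return (k / (k - 1)) * (1 - (p * q).sum() / var_total)

# Hypothetical 0/1 responses of 6 children to 5 items
data = [[1,1,0,1,1],[1,0,0,1,0],[1,1,1,1,1],[0,0,0,1,0],[1,1,1,0,1],[1,1,0,1,1]]
print(f"KR-20 = {kr20(data):.3f}")
```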
Procedia PDF Downloads 538
913 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation
Authors: Oğuzhan Urhan
Abstract:
In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform is proposed for block-based ME employed in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames with a Boolean exclusive-OR based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The weighted constrained one-bit transform (WC-1BT) based approach improves the performance of conventional C-1BT based ME by employing a 2-bit depth constraint mask instead of a 1-bit depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy compared with similar existing ME methods in the literature.
Keywords: fast motion estimation, low-complexity motion estimation, video coding
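A minimal sketch of the binary-ME idea: frames are binarized by comparison against a filtered version (a plain box filter here stands in for the multi band-pass kernel used in the 1BT literature), and blocks are matched by counting non-matching bits with XOR. Block size, search range and test data are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(frame, size=17):
    """Binarize a frame by comparing each pixel to a local mean.
    (A box filter is used here purely as a simple stand-in for the
    multi band-pass kernel of the 1BT literature.)"""
    return frame > uniform_filter(frame.astype(float), size)

def match_block(cur_bits, ref_bits, y, x, bs=16, sr=7):
    """Full search: the matching cost is the XOR count (non-matching points)."""
    block = cur_bits[y:y+bs, x:x+bs]
    best, best_mv = bs * bs + 1, (0, 0)
    for dy in range(-sr, sr + 1):
        for dx in range(-sr, sr + 1):
            ry, rx = y + dy, x + dx
            if 0 <= ry and 0 <= rx and ry + bs <= ref_bits.shape[0] and rx + bs <= ref_bits.shape[1]:
                cost = np.count_nonzero(block ^ ref_bits[ry:ry+bs, rx:rx+bs])
                if cost < best:
                    best, best_mv = cost, (dy, dx)
    return best_mv, best

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64))
cur = np.roll(ref, (2, -3), axis=(0, 1))          # synthetic motion
mv, cost = match_block(one_bit_transform(cur), one_bit_transform(ref), 24, 24)
print(mv, cost)   # expect a motion vector near (-2, 3)
```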
Procedia PDF Downloads 316
912 Useful Lifetime Prediction of Chevron Rubber Spring for Railway Vehicle
Authors: Chang Su Woo, Hyun Sung Park
Abstract:
Useful lifetime evaluation of the chevron rubber spring is very important in the design procedure to assure safety and reliability. It is therefore necessary to establish a suitable criterion for the replacement period of the chevron rubber spring. In this study, we performed a characteristic analysis and useful lifetime prediction of the chevron rubber spring. The rubber material coefficients were obtained by curve fitting of uni-axial tension, equi-biaxial tension and pure shear tests. Computer simulation was executed to predict and evaluate the load capacity and stiffness of the chevron rubber spring. In order to predict the useful lifetime of the rubber material, we carried out compression set tests with heat aging in an oven at temperatures ranging from 50°C to 100°C over a period of 180 days. By using the Arrhenius plot, several useful lifetime prediction equations for the rubber material were proposed.
Keywords: chevron rubber spring, material coefficient, finite element analysis, useful lifetime prediction
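The Arrhenius extrapolation step can be sketched as follows: failure times from accelerated aging are fitted with ln(t) linear in 1/T, and the line is extrapolated to the service temperature. The aging data below are hypothetical, not the paper's measurements.

```python
import numpy as np

# Hypothetical accelerated-aging results: time (days) for the compression set
# to reach a failure threshold at each oven temperature.
temps_c = np.array([50.0, 70.0, 85.0, 100.0])
life_days = np.array([160.0, 55.0, 22.0, 9.0])

# Arrhenius plot: ln(t) is linear in 1/T (T in kelvin)
inv_T = 1.0 / (temps_c + 273.15)
slope, intercept = np.polyfit(inv_T, np.log(life_days), 1)

def predicted_life(temp_c):
    """Extrapolate useful lifetime (days) to a service temperature."""
    return float(np.exp(intercept + slope / (temp_c + 273.15)))

print(f"predicted life at 25 degC: {predicted_life(25.0):.0f} days")
```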
Procedia PDF Downloads 567
911 Multimodal Direct Neural Network Positron Emission Tomography Reconstruction
Authors: William Whiteley, Jens Gregor
Abstract:
In recent developments of direct neural network based positron emission tomography (PET) reconstruction, two prominent architectures have emerged for converting measurement data into images: 1) networks that contain fully-connected layers; and 2) networks that primarily use a convolutional encoder-decoder architecture. In this paper, we present a multi-modal direct PET reconstruction method called MDPET, which is a hybrid approach that combines the advantages of both types of networks. MDPET processes raw data in the form of sinograms and histo-images in concert with attenuation maps to produce high quality multi-slice PET images (e.g., 8x440x440). MDPET is trained on a large whole-body patient data set and evaluated both quantitatively and qualitatively against target images reconstructed with the standard PET reconstruction benchmark of iterative ordered subsets expectation maximization. The results show that MDPET outperforms the best previously published direct neural network methods in measures of bias, signal-to-noise ratio, mean absolute error, and structural similarity.
Keywords: deep learning, image reconstruction, machine learning, neural network, positron emission tomography
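The paper's architecture is not reproduced here, but a toy model in its spirit, with a fully-connected domain transform followed by a convolutional encoder-decoder refinement, can be sketched as below (all sizes are illustrative, far smaller than the 8x440x440 outputs above):

```python
import torch
import torch.nn as nn

class HybridDirectPET(nn.Module):
    """Toy hybrid direct-reconstruction net in the spirit of MDPET (sizes are
    illustrative, not the paper's): a fully-connected layer maps the sinogram
    into image space, then a small convolutional encoder-decoder refines it."""
    def __init__(self, n_bins=64, n_angles=64, img=64):
        super().__init__()
        self.img = img
        self.domain_transform = nn.Linear(n_bins * n_angles, img * img)
        self.refine = nn.Sequential(              # conv encoder-decoder branch
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 2, stride=2),
        )

    def forward(self, sinogram):                  # (B, n_angles, n_bins)
        x = self.domain_transform(sinogram.flatten(1))
        x = x.view(-1, 1, self.img, self.img)
        return x + self.refine(x)                 # residual refinement

model = HybridDirectPET()
fake_sinogram = torch.rand(2, 64, 64)
print(model(fake_sinogram).shape)                 # torch.Size([2, 1, 64, 64])
```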
Procedia PDF Downloads 110
910 Practical Methods for Automatic MC/DC Test Cases Generation of Boolean Expressions
Authors: Sekou Kangoye, Alexis Todoskoff, Mihaela Barreau
Abstract:
Modified Condition/Decision Coverage (MC/DC) is a structural coverage criterion that aims to prove that every condition involved in a Boolean expression can influence the result of that expression. In the automotive context, MC/DC is highly recommended, and even required, for testing most security and safety applications. However, due to the complex Boolean expressions that are often embedded in those applications, generating a set of MC/DC-compliant test cases for any of these expressions is a nontrivial task and can be time-consuming for testers. In this paper, we present an approach to automatically generate MC/DC test cases for any Boolean expression. We introduce novel techniques, essentially based on binary trees, to quickly and optimally generate MC/DC test cases for the expressions. Thus, the approach can be used to reduce the manual testing effort of testers.
Keywords: binary trees, MC/DC, test case generation, nontrivial task
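The paper's binary-tree technique is not reproduced here; the brute-force sketch below merely shows what an MC/DC independence pair is: for each condition, two test vectors differing only in that condition that flip the decision outcome. Exhaustive enumeration is exactly what cleverer methods avoid for large expressions.

```python
from itertools import product

def mcdc_pairs(expr, names):
    """Brute-force search for MC/DC independence pairs: for every condition,
    find two test vectors that differ only in that condition and flip the
    decision outcome. expr takes a dict of condition values."""
    pairs = {}
    vectors = [dict(zip(names, bits))
               for bits in product([False, True], repeat=len(names))]
    for cond in names:
        for v in vectors:
            flipped = dict(v, **{cond: not v[cond]})
            if expr(v) != expr(flipped):
                pairs[cond] = (v, flipped)
                break                      # one independence pair is enough
    return pairs

# Example decision: (A and B) or C
decision = lambda e: (e["A"] and e["B"]) or e["C"]
for cond, (v1, v2) in mcdc_pairs(decision, ["A", "B", "C"]).items():
    print(cond, v1, "->", decision(v1), "|", v2, "->", decision(v2))
```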
Procedia PDF Downloads 447
909 Continuous and Discontinuous Modeling of Wellbore Instability in Anisotropic Rocks
Authors: C. Deangeli, P. Obentaku Obenebot, O. Omwanghe
Abstract:
The study focuses on the analysis of wellbore instability in rock masses affected by weakness planes. In such rocks, failure can occur in the rock matrix and/or along the weakness planes, in relation to the mud weight gradient. In this case, the simple Kirsch solution coupled with a failure criterion cannot supply a suitable scenario for borehole instabilities. Two different numerical approaches have been used to investigate the onset of local failure at the wall of a borehole. For each approach, the influence of the inclination of the weakness planes has been investigated by considering joint sets at 0°, 35° and 90° to the horizontal. The first set of models was carried out with FLAC 2D (Fast Lagrangian Analysis of Continua), treating the rock material as a continuous medium, with a Mohr-Coulomb criterion for the rock matrix and the ubiquitous joint model to account for the presence of the weakness planes. In this model, yield may occur in the solid, along the weak plane, or both, depending on the stress state, the orientation of the weak plane and the material properties of the solid and weak plane. The second set of models was performed with PFC2D (Particle Flow Code). This code is based on the Discrete Element Method and considers the rock material as an assembly of grains bonded by cement-like materials, with pore spaces. The presence of weakness planes is simulated by the degradation of the bonds between grains along given directions. In general, the results of the two approaches are in agreement. However, the discrete approach seems to capture more complex phenomena related to local failure, in the form of grain detachment at the wall of the borehole. In fact, the presence of weakness planes in the discontinuous medium leads to local instability along the weak planes even in conditions not predicted by the continuous solution. In general, slip failure locations and directions do not follow the conventional wellbore breakout direction but depend upon the internal friction angle and the orientation of the bedding planes. When the weakness planes are at 0° or 90°, the behaviour is similar to that of a continuous rock material, but borehole instability is more severe when the weakness planes are inclined at an angle between 0° and 90° to the horizontal. In conclusion, the results of the numerical simulations show that the prediction of local failure at the wall of the wellbore cannot disregard the presence of weakness planes, and consequently the higher mud weight required for stability at any specific inclination of the joints. Although the discrete approach can only simulate smaller areas, because of the large number of particles required for the generation of the rock material, it seems to investigate more correctly the occurrence of failure at the microscale and, eventually, the propagation of the failed zone to a large portion of rock around the wellbore.
Keywords: continuous-discontinuous, numerical modelling, weakness planes, wellbore, FLAC 2D
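For context, the "simple Kirsch solution coupled with a failure criterion" mentioned above can be sketched as follows for a vertical borehole in isotropic rock: hoop and radial stresses at the wall are evaluated around the circumference and checked against a Mohr-Coulomb criterion. The stress magnitudes and strength parameters are hypothetical, and pore pressure is ignored.

```python
import math

def kirsch_wall_stresses(sh_max, sh_min, p_mud, theta_deg):
    """Kirsch solution at the wall of a vertical borehole (r = a) in isotropic
    elastic rock; theta is measured from the direction of sh_max. Total
    stresses; pore pressure is ignored in this sketch."""
    th = math.radians(theta_deg)
    sigma_theta = sh_max + sh_min - 2 * (sh_max - sh_min) * math.cos(2 * th) - p_mud
    sigma_r = p_mud
    return sigma_theta, sigma_r

def mohr_coulomb_fails(s1, s3, ucs, phi_deg):
    """Shear failure check: s1 >= UCS + tan^2(45 + phi/2) * s3."""
    q = math.tan(math.radians(45 + phi_deg / 2)) ** 2
    return s1 >= ucs + q * s3

# Hypothetical input in MPa: far-field stresses, mud pressure, rock strength
sh_max, sh_min, p_mud, ucs, phi = 40.0, 20.0, 10.0, 40.0, 30.0
for theta in (0, 45, 90):   # breakouts expected near theta = 90 (perpendicular to sh_max)
    st, sr = kirsch_wall_stresses(sh_max, sh_min, p_mud, theta)
    s1, s3 = max(st, sr), min(st, sr)
    print(theta, round(st, 1), mohr_coulomb_fails(s1, s3, ucs, phi))
```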
Procedia PDF Downloads 499
908 Track Initiation Method Based on Multi-Algorithm Fusion Learning of 1DCNN And Bi-LSTM
Abstract:
Aiming at the problem of high-density clutter and interference affecting radar target track initiation in ECM and complex radar missions, to which traditional radar track initiation methods have been difficult to adapt, we propose a multi-algorithm fusion learning track initiation algorithm. It transforms the track initiation problem into a true/false track discrimination problem and designs an algorithm based on 1DCNN (one-dimensional CNN) combined with Bi-LSTM (bi-directional long short-term memory) for fusion classification. The experimental dataset consists of real trajectories obtained from measurements of a certain type of three-coordinate radar, and the experiments compare the approach with traditional track initiation methods such as the rule-based, logic-based and Hough-transform-based methods. The simulation results show that the overall performance of the multi-algorithm fusion learning track initiation algorithm is significantly better than that of the traditional methods, and the real track initiation rate can be effectively improved under high clutter density, with an average initiation time similar to that of the logic-based method.
Keywords: track initiation, multi-algorithm fusion, 1DCNN, Bi-LSTM
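A toy sketch of the fusion idea, with a 1DCNN branch and a Bi-LSTM branch whose features are concatenated for true/false track classification; layer sizes, sequence length and input features are illustrative, not the paper's configuration.

```python
import torch
import torch.nn as nn

class FusionTrackInit(nn.Module):
    """Sketch of true/false track discrimination fusing a 1DCNN branch and a
    Bi-LSTM branch (layer sizes are illustrative, not taken from the paper).
    Input: candidate tracks as (batch, steps, features), e.g. a few scans of
    three-coordinate radar measurements."""
    def __init__(self, n_feat=3, hidden=32):
        super().__init__()
        self.cnn = nn.Sequential(                      # local-pattern branch
            nn.Conv1d(n_feat, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.lstm = nn.LSTM(n_feat, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(hidden + 2 * hidden, 2)  # fuse -> {false, true} track

    def forward(self, x):                              # x: (B, T, F)
        c = self.cnn(x.transpose(1, 2)).squeeze(-1)    # (B, hidden)
        _, (h, _) = self.lstm(x)                       # h: (2, B, hidden)
        s = torch.cat([h[0], h[1]], dim=1)             # both directions
        return self.head(torch.cat([c, s], dim=1))     # classification logits

model = FusionTrackInit()
candidates = torch.rand(4, 6, 3)      # 4 candidate tracks, 6 scans, (r, az, el)
print(model(candidates).shape)        # torch.Size([4, 2])
```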
Procedia PDF Downloads 94
907 Attitude, Practice, and Prevalence of Injuries among Building Construction Workers in Lagos State
Authors: O. J. Makinde, O. A. Abiola
Abstract:
Background: Hazards and injuries are two common phenomena that have been associated with the building construction profession. Apart from injuries, deaths from injuries sustained at work have been high in this profession. This study, therefore, attempts to determine the attitude to and practice of occupational safety, and the prevalence of injuries, among this group of workers. Methods: This was a cross-sectional study with 285 respondents. The sampling was multi-staged. Interviewer-administered questionnaires were used to elicit information such as the socio-demographic characteristics of the respondents, attitude to and practice of occupational safety, and prevalence of injuries among the workers. The data were analyzed using Epi Info 3.5.1 statistical software. Results: The modal age group was 25-34 years, which made up 40% of the respondents. Most of the respondents were male (86.3%). Most of the respondents (52.3%) had secondary school as their highest educational level. Most of the respondents (64.9%) had a poor attitude to occupational safety, while 91.6% had poor occupational safety practices. The prevalence of occupational injury was very high (64.9%). Particles in the eyes had the highest prevalence (52.3%), while electric shock had the least (19.6%). None of the respondents working at height used a safety belt while working. Conclusion: Attitude to and practice of occupational safety are poor among this group of workers, and the prevalence of injuries was high.
Keywords: building, construction, hazard, injury, workers
Procedia PDF Downloads 371
906 Comprehensive Risk Assessment Model in Agile Construction Environment
Authors: Jolanta Tamošaitienė
Abstract:
The article focuses on a comprehensive model developed for risk assessment and selection in an agile environment, based on multi-attribute methods. The model rests on a multi-attribute evaluation of risk in construction, and the optimality criterion values of the attributes are calculated using complex Multiple Criteria Decision-Making methods. The model may be further applied to risk assessment in an agile construction environment. The attributes of risk in a construction project are selected by applying the risk assessment conditions of the construction sector, and the construction process efficiency in the construction industry accounts for the agile environment. The paper presents the comprehensive risk assessment model in an agile construction environment, providing a background, a description of the proposed model, and the developed analysis of the model together with its criteria.
Keywords: assessment, environment, agile, model, risk
Procedia PDF Downloads 255
905 Solubility Measurements in the Context of Nanoregulation
Authors: Ratna Tantra
Abstract:
From a risk assessment point of view, solubility is a property that has been identified as being important. If a nanomaterial is completely soluble, then its disposal can be treated in much the same way as 'ordinary' chemicals, which subsequently will simplify testing and characterization regimes. The measurement of solubility has been highlighted as important in a pan-European project, the Framework Programme (FP) 7 NANoREG. Some of the project outputs surrounding this topic will be presented here, in two parts. First, a review of existing methods capable of measuring nanomaterial solubility will be discussed. Second, a case study will be presented, based on using colorimetry methods to quantify dissolved zinc from ZnO nanomaterial upon exposure to digestive juices. The main findings are as follows: a) there is no universal method for nanomaterial solubility testing; the method chosen will depend on the sample type and the nano-specific application/scenario. b) The colorimetry results show a positive correlation between particle concentration and the amount of [Zn2+] released, as expected. c) The results indicate complete dissolution of the ZnO nanomaterial as a result of the digestion protocol, but with only a fraction existing as free ions. Finally, what differentiates the FP7 NANoREG project from other projects is the need for participating research laboratories to follow a set of defined protocols, necessary to establish quality control and assurance. The methods and results associated with the mandatory testing carried out by all partners in NANoREG will be discussed.
Keywords: nanomaterials, nanotoxicology, solubility, zinc oxide
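Colorimetric quantification of dissolved zinc ultimately rests on the Beer-Lambert law A = εlc. A minimal sketch with an assumed molar absorptivity for the Zn2+-dye complex (illustrative only; the project's calibration data are not reproduced here):

```python
def concentration_beer_lambert(absorbance, epsilon, path_cm=1.0):
    """Beer-Lambert law A = epsilon * l * c, solved for the concentration c.
    epsilon: molar absorptivity [L/(mol*cm)], path_cm: cuvette path length."""
    return absorbance / (epsilon * path_cm)

# Hypothetical colorimetric assay for dissolved zinc: assumed molar
# absorptivity of the Zn2+-dye complex and a measured absorbance.
epsilon = 1.9e4            # L/(mol*cm), illustrative value only
a_measured = 0.42
c = concentration_beer_lambert(a_measured, epsilon)
print(f"[Zn2+] = {c*1e6:.1f} umol/L")
```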
Procedia PDF Downloads 335
904 The Development of the Self-concept Scale for Elders in Taiwan
Authors: Ting-Chia Lien, Tzu-Yin Yen, Szu-Fan Chen, Tai-chun Kuo, Hung-Tse Lin, Yi-Chen Chung, Hock-Sen Gwee
Abstract:
The purpose of this study was to explore the results of the survey conducted in developing the 'Self-Concept Scale for Elders', which could provide community counseling and guidance institutions with a tool for practical application. The sample of this study consisted of 332 elders in Taiwan (male: 33.4%; female: 66.6%). The ages of the participants ranged from 65 to 98 years. The measurement applied in this study was the 'Self-Concept Scale for Elders'. After item and factor analyses, the preliminary version of the Self-Concept Scale for Elders was revised to the final version. The results are summarized as follows: 1) There are 10 items in the Self-Concept Scale for Elders. 2) The scale accounted for 77.15% of the variance, with satisfactory corrected item-total correlations and Cronbach's alpha = 0.87. 3) The content validity, criterion validity and construct validity were found to be satisfactory. Based on the findings, implications and suggestions are offered for reference regarding counselor education and future research.
Keywords: self-concept, elder, development scale, applied psychology
Procedia PDF Downloads 570
903 C-Spine Imaging in a Non-trauma Centre: Compliance with NEXUS Criteria Audit
Authors: Andrew White, Abigail Lowe, Kory Watkins, Hamed Akhlaghi, Nicole Winter
Abstract:
The timing and appropriateness of diagnostic imaging are critical to the evaluation and management of traumatic injuries. Within the subclass of trauma patients, the prevalence of c-spine injury is less than 4%. However, the incidence of delayed diagnosis within this cohort has been documented as up to 20%, with inadequate radiological examination the most cited issue. In order to assess those in whom c-spine injury cannot be fully excluded on clinical examination alone, and who therefore should undergo diagnostic imaging, a set of criteria is used to provide clinical guidance. The NEXUS (National Emergency X-Radiography Utilisation Study) criteria are a validated clinical decision-making tool used to facilitate selective c-spine radiography. The criteria allow clinicians to determine whether cervical spine imaging can be safely avoided in appropriate patients. The NEXUS criteria are widely used within the Emergency Department setting, given their ease of use and relatively straightforward application, and are used in the Victorian State Trauma System's guidelines. This audit used retrospective data collection to examine the concordance of c-spine imaging in trauma patients with the NEXUS criteria and to assess compliance with state guidance on diagnostic imaging in trauma. Of the 183 patients that presented with trauma to the head, neck, or face (244 were excluded due to incorrect triage), 98 did not undergo imaging of the c-spine. Of those 98, 44% fulfilled at least one of the NEXUS criteria, meaning the c-spine could not be clinically cleared as per the current guidelines. The criterion most often met was intoxication, comprising 42% (18 of 43), with midline spinal tenderness (or absence of documentation of this) the second most common at 23% (10 of 43). Intoxication being the most met criterion is significant but not unexpected, given the cohort of patients seen at St Vincent's and within many emergency departments in general. Since these patients will always meet the NEXUS criteria, an element of clinical judgment is likely needed, or concurrent use of the Canadian C-Spine Rule, to exclude the need for imaging. Midline tenderness as a met criterion was often recorded in the context of poor or absent documentation, emphasizing the importance of clear and accurate assessments. A distracting injury was identified in 7 of the 43 patients; however, only one of these patients exhibited a thoracic injury (a T11 compression fracture), with the remainder comprising injuries to the extremities. Some studies suggest that c-spine imaging may not be required in the evaluable blunt trauma patient despite distracting injuries in body regions that do not involve the upper chest. This emphasises the need for standardised definitions of distracting injury, at least at a departmental/regional level. The data highlight the currently poor application of the NEXUS guidelines, with likely common themes throughout emergency departments, highlighting the need for further education regarding implementation and potential refinement/clarification of the criteria. Of note, there appeared to be no significant differences between levels of experience with respect to inappropriately clearing the c-spine clinically against the guidelines.
Keywords: imaging, guidelines, emergency medicine, audit
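The NEXUS rule itself is simple enough to express directly: imaging can be avoided only when all five criteria are absent. A minimal sketch (field names are illustrative):

```python
def nexus_low_risk(patient):
    """NEXUS: c-spine imaging can be safely avoided only if ALL five criteria
    are absent. Returns (can_clear_clinically, criteria_met)."""
    criteria = {
        "midline cervical tenderness": patient["midline_tenderness"],
        "focal neurological deficit": patient["neuro_deficit"],
        "altered level of alertness": patient["altered_alertness"],
        "intoxication": patient["intoxicated"],
        "painful distracting injury": patient["distracting_injury"],
    }
    met = [name for name, present in criteria.items() if present]
    return len(met) == 0, met

# Example: an intoxicated patient with no other findings still needs imaging
patient = dict(midline_tenderness=False, neuro_deficit=False,
               altered_alertness=False, intoxicated=True,
               distracting_injury=False)
clearable, met = nexus_low_risk(patient)
print(clearable, met)     # False ['intoxication']
```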
Procedia PDF Downloads 72
902 The Significance of Community Life in Promoting Unity in the Light of Acts 2:42
Authors: Takesure Mahohoma
Abstract:
Community life is an epitome of the African axiom 'I am because we are; since we are, therefore I am.' This culminates in the Ubuntu philosophy, which is summarized in the Zulu words 'umuntu ngumuntu ngabantu' (a person is a person through other people). This relationship gives honour to all people, and this is the gist of the paper. The paper seeks to demonstrate the impact of community life in promoting unity from an African perspective. Using the proto-community in Acts 2:42, it is argued that community life is a solution to many social problems that divide African society today. The aim is to encourage all Africans and other people to cultivate a sense of belonging and to value community life in the light of Acts 2:42. Hence we shall trace this theme through the Old Testament, the New Testament, and Christian history. Another section touches on the essence of community life and the obstacles that hinder it. We shall offer spiritual suggestions and an integrative reflection. The nature of the paper is theological in general but spiritual in particular. As a spiritual paper, it is guided by the foundational approach; thus, it employs the dialogical and integrative reflection method. The expected result is that freedom from all the miseries experienced is brought by living a community life. This is a life that gives greater assurance of enough food, education, health, peace, employment, and increased responsibility that values human dignity. Thus people are neighbours to each other; there is no stranger among them. The basic presumption is that there can be no development in any society without community life.
Keywords: community, seged, koinonia, neighbor
Procedia PDF Downloads 287
901 Labor Income Share Change and Mergers and Acquisitions: Empirical Evidence of the Importance of Employees
Authors: Jie Zhang, Chaomin Zhang
Abstract:
Mergers and acquisitions (M&A) are important market tools to support economic transformation and upgrading in order to achieve high-quality development. Based on the distribution of value to employees in the context of M&A and the reorganization of Chinese enterprises, this paper takes China's A-share listed companies from 2007 to 2022 as research samples to explore the impact of fluctuations in the labor income share on the success rate of M&A. The research finds that, first, when employees of the target expect the share of labor income to decline after the merger, this significantly inhibits the success rate of the merger. Second, this effect is more pronounced when there is a vertical gap (that is, the target has a larger scale and a higher level of corporate governance) or a horizontal gap (that is, the merging parties are in different industries and pursue different strategies). Third, for enterprises that have completed the M&A process, a decline in the labor income share leads to higher post-M&A goodwill impairment. The conclusions of this paper enrich the literature on the economic consequences of the labor income share and on the factors influencing M&A, and provide a useful reference for enterprises seeking to better coordinate the distribution of value to employees in M&A.
Keywords: labor income share, the success rate of M&A, value distribution, goodwill impairment
Procedia PDF Downloads 18
900 Design of Membership Ranges for Fuzzy Logic Control of Refrigeration Cycle Driven by a Variable Speed Compressor
Authors: Changho Han, Jaemin Lee, Li Hua, Seokkwon Jeong
Abstract:
The design of membership function ranges in fuzzy logic control (FLC) is presented for robust control of a variable speed refrigeration system (VSRS). The criterion values of the membership function ranges can be derived from static experimental data, and two different sets of values are offered to compare control performance. Simulations and real experiments on the VSRS were conducted to verify the validity of the designed membership functions. The experimental results showed good agreement with the simulation results, and the error change rate and its sampling time strongly affected the control performance in the transient state of the VSRS.
Keywords: variable speed refrigeration system, fuzzy logic control, membership function range, control performance
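A minimal sketch of how such membership function ranges can be encoded: triangular membership functions whose feet and peaks are placed at criterion values that would, in the paper's setting, come from static experiments on the VSRS. The variable, universe and breakpoints below are assumed for illustration.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    x = np.asarray(x, dtype=float)
    left = (x - a) / (b - a) if b != a else np.ones_like(x)
    right = (c - x) / (c - b) if c != b else np.ones_like(x)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

# Hypothetical ranges for a control-error input, centred on criterion values
# that would be determined from static experiments on the VSRS
universe = np.linspace(-10, 10, 5)      # error [K]
labels = {
    "negative": trimf(universe, -10, -5, 0),
    "zero":     trimf(universe, -5, 0, 5),
    "positive": trimf(universe, 0, 5, 10),
}
for name, mu in labels.items():
    print(name, mu)
```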
Procedia PDF Downloads 265
899 Hybrid Concrete Construction (HCC) for Sustainable Infrastructure Development in Nigeria
Authors: Muhammad Bello Ibrahim, M. Auwal Zakari, Aliyu Usman
Abstract:
Hybrid concrete construction (HCC) combines all the benefits of pre-casting with the advantages of cast in-situ construction. Merging the two, as a hybrid structure, results in even greater construction speed, value, and overall economy. Its variety of uses has gained popularity in the United States and in Europe due to its distinctive benefits. However, the uptake of its application in some countries (including Nigeria) has been relatively slow. Several studies have shown that hybrid construction offers an ultra-high-performance concrete with superior strength, durability and aesthetics, together with design flexibility and sound sustainability credentials, based on the available and economically viable technologies. This paper examines and documents the criteria that will help inform the process of deciding whether or not to adopt hybrid concrete construction (HCC) technology rather than more traditional alternatives. It also reviews the present situation of design, construction and research on hybrid structures.
Keywords: hybrid concrete construction, Nigeria, sustainable infrastructure development, design flexibility
Procedia PDF Downloads 561
898 Resolving Conflicts of Constitutional Nature: Inside the Romanian Constitutional Court's Rulings on the Role and Competencies of the Public Authorities
Authors: Marieta Safta
Abstract:
The separation and balance of state powers constitute the basis of the rule of law. Observance of this principle requires the framing of public authorities within the limits of competence established by the Constitution and the law, as well as loyal cooperation between them. From this perspective, the competence of constitutional courts to settle legal conflicts of a constitutional nature is an important tool for correcting tendencies to violate these limits, as well as for identifying solutions for situations that find no explicit regulation in the constitutional texts. The present study analyzes the jurisprudence of the Constitutional Court of Romania in the field of legal conflicts of a constitutional nature, revealing, together with the presentation of conflict situations, the vulnerabilities of the constitutional reference texts. The role of constitutional courts in the evolution of the institutions of constitutional law, even in terms of defining and redefining the regime of the forms of government, is also highlighted. The conclusion of the study, beyond the subject of legal conflicts of a constitutional nature, bears on the necessity, even more so in this matter, of certainty in jurisdictional interpretation. This certainty cannot be achieved as long as the interpretation is not authoritative; consequently, ensuring the effectiveness of constitutional justice constitutes a key issue of the rule of law.
Keywords: legal conflicts of constitutional nature, the Constitutional Court of Romania, the separation and balance of powers in the state, the effectiveness of constitutional justice
Procedia PDF Downloads 128