Search results for: SoC soft error rate
10266 Osteitis in the Diabetic Foot and the Risk Factor on the Population
Authors: Mohamed Amine Adaour, Mohamed Sadek Bachene, Mosaab Fortassi, Wafaa Siouda
Abstract:
Foot infections are responsible for a significant number of hospitalizations and amputations in diabetic patients. The objective of our study is to analyze and evaluate the management of the diabetic foot in a surgical setting. A retrospective study was conducted based on a selected case of suspected diabetic foot osteitis treated at the Mohamed Boudiaf hospital in Medea. The case was managed with a staged therapeutic protocol, consisting of first treating the soft tissue infection and then the osteitis, with bone biopsy after at least 15 days of cessation of antibiotic therapy. Successful treatment of the osteitis was defined as complete wound healing at the end of the follow-up period, with no bone resection or amputation surgery at the initial bone site during follow-up; biopsies were instead prescribed during the treatment of the soft tissue infection. The mean duration of treatment for the soft tissue infection was 2-3 weeks, and the antibiotic-free window prior to bone biopsy was 2-4 weeks. The patient received medical management without surgical resection. The success rate for treating osteitis at one year was 73%, and healing at one year was 88%. Treatment is often limited to salvage of the foot at the cost of repeated amputations. The best management remains prevention, which necessarily involves setting up a specialized and adapted centre.
Keywords: osteitis, antibiotic, biopsy, diabetic foot
Procedia PDF Downloads 99
10265 Mixed Integer Programming-Based One-Class Classification Method for Process Monitoring
Authors: Younghoon Kim, Seoung Bum Kim
Abstract:
One-class classification plays an important role in detecting outliers and abnormalities among normal observations. In previous research, several attempts were made to extend the scope of application of one-class classification techniques to statistical process control problems. For most previous approaches, such as the support vector data description (SVDD) control chart, the design of the control limits is commonly based on the assumption that the proportion of abnormal observations is approximately equal to an expected Type I error rate in the Phase I process. Because of the limitation of one-class classification techniques based on convex optimization, the proportion of abnormal observations cannot be made exactly equal to the expected Type I error rate: controlling the Type I error rate requires optimizing constraints with integer decision variables, and convex optimization cannot satisfy this requirement. This limitation is undesirable, from both theoretical and practical perspectives, for constructing effective control charts. In this work, to address the limitation of previous approaches, we propose a one-class classification algorithm based on mixed integer programming, which can solve problems formulated with both continuous and integer decision variables. The proposed method minimizes the radius of a spherically shaped boundary subject to the constraint that the number of normal observations it encloses equals a constant value specified by the user. By modifying this constant value, users can exactly control the proportion of normal data described by the spherically shaped boundary. Thus, the proportion of abnormal observations can be made theoretically equal to an expected Type I error rate in the Phase I process. Moreover, analogous to SVDD, the boundary can be made to describe complex structures by using kernel functions. A new multivariate control chart based on the proposed algorithm is presented. This chart uses a monitoring statistic that characterizes the degree to which a point is abnormal, as obtained through the proposed one-class classification. The control limit of the proposed chart is established by the radius of the boundary. The usefulness of the proposed method was demonstrated through experiments with simulated and real process data from a thin film transistor-liquid crystal display.
Keywords: control chart, mixed integer programming, one-class classification, support vector data description
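One possible way to write such a cardinality-constrained spherical boundary as a big-M mixed integer program (a sketch of the idea only, not necessarily the authors' exact formulation) is:

\begin{aligned}
\min_{R,\,a,\,z}\quad & R^2 \\
\text{s.t.}\quad & \lVert x_i - a \rVert^2 \le R^2 + M\,(1 - z_i), \qquad i = 1,\dots,n, \\
& \textstyle\sum_{i=1}^{n} z_i = n_0, \\
& z_i \in \{0,1\}, \qquad i = 1,\dots,n,
\end{aligned}

where n_0 is the user-specified number of normal observations to be enclosed, M is a sufficiently large constant, and a kernelized variant replaces the squared Euclidean distance with its kernel expansion.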
Procedia PDF Downloads 174
10264 Error Detection and Correction for Onboard Satellite Computers Using Hamming Code
Authors: Rafsan Al Mamun, Md. Motaharul Islam, Rabana Tajrin, Nabiha Noor, Shafinaz Qader
Abstract:
In an attempt to enrich the lives of billions of people by providing proper information, security, and a way of communicating with others, the need for efficient and improved satellites is constantly growing. Thus, there is an increasing demand for better error detection and correction (EDAC) schemes that are capable of protecting the data onboard satellites. This paper is aimed towards detecting and correcting such errors using a special algorithm, the Hamming code, which uses the concept of parity and parity bits to protect against single-bit errors onboard a satellite in Low Earth Orbit. The paper focuses on the study of Low Earth Orbit satellites and the process of generating the Hamming code matrix to be used for EDAC using computer programs. The most effective version of the Hamming code generated was the Hamming (16, 11, 4) version implemented in MATLAB, and the paper compares this particular scheme with other EDAC mechanisms, including other versions of Hamming codes and the Cyclic Redundancy Check (CRC), and discusses the limitations of this scheme. This version of the Hamming code guarantees single-bit error correction as well as double-bit error detection. Furthermore, it has proved to be fast, with a checking time of 5.669 nanoseconds, has a relatively higher code rate and lower bit overhead compared to the other versions, and can detect a greater percentage of errors per length of code than other EDAC schemes with similar capabilities. In conclusion, with the proper implementation of the system, it is quite possible to ensure a relatively uncorrupted satellite storage system.
Keywords: bit-flips, Hamming code, low earth orbit, parity bits, satellite, single error upset
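As an illustration of the single-error-correcting, double-error-detecting behaviour described above, here is a minimal Python sketch of an extended Hamming (16, 11, 4) encoder/decoder; the bit ordering and parity layout are assumptions for illustration and may differ from the MATLAB implementation in the paper.

def hamming_encode(data_bits):
    """Encode 11 data bits into a 16-bit extended Hamming codeword."""
    assert len(data_bits) == 11
    code = [0] * 16                      # index 0 = overall parity, 1..15 = Hamming(15, 11)
    data = iter(data_bits)
    for pos in range(1, 16):
        if pos & (pos - 1):              # not a power of two -> data position
            code[pos] = next(data)
    for p in (1, 2, 4, 8):               # even parity over positions whose index has bit p set
        code[p] = sum(code[i] for i in range(1, 16) if i & p) % 2
    code[0] = sum(code[1:]) % 2          # overall parity bit for double-error detection
    return code

def hamming_decode(code):
    """Return (data bits, status) where status is 'ok', 'corrected' or 'double-error'."""
    syndrome = 0
    for p in (1, 2, 4, 8):
        if sum(code[i] for i in range(1, 16) if i & p) % 2:
            syndrome |= p
    overall = sum(code) % 2
    status = "ok"
    if syndrome and overall:             # single-bit error: syndrome gives its position
        code = code[:]
        code[syndrome] ^= 1
        status = "corrected"
    elif syndrome and not overall:       # two errors: detectable but not correctable
        status = "double-error"
    data = [code[i] for i in range(1, 16) if i & (i - 1)]
    return data, status

# Example: flip one bit of a codeword and recover the original data.
msg = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
cw = hamming_encode(msg)
cw[5] ^= 1
decoded, status = hamming_decode(cw)
assert decoded == msg and status == "corrected"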
Procedia PDF Downloads 130
10263 Iterative Method for Lung Tumor Localization in 4D CT
Authors: Sarah K. Hagi, Majdi Alnowaimi
Abstract:
In the last decade, there have been immense advancements in medical imaging modalities. These modalities can scan the whole volume of the lung in high-resolution images within a short time. With this performance, physicians can clearly identify the complicated anatomical and pathological structures of the lung. These advancements therefore offer great opportunities to improve all types of available lung cancer treatment and should increase the survival rate. However, lung cancer is still one of the major causes of death, involving around 19% of all cancer patients. Several factors may affect the survival rate. One of the most serious is the breathing process, which can affect the accuracy of diagnosis and of the lung tumor treatment plan. We have therefore developed a semi-automated algorithm to localize the 3D lung tumor position across all respiratory phases during respiratory motion. The algorithm can be divided into two stages. First, the lung tumor is segmented in the first phase of the 4D computed tomography (CT) scan using an active contours method. Then, the 3D tumor position is localized across all subsequent phases using an affine transformation with 12 degrees of freedom. Two data sets were used in this study: a simulated 4D CT data set generated with the extended cardiac-torso (XCAT) phantom, and clinical 4D CT data sets. The error is reported as the root mean square error (RMSE). The average error across the data sets is 0.94 ± 0.36 mm. Finally, an evaluation and quantitative comparison of the results with a state-of-the-art registration algorithm is presented. The results obtained from the proposed localization algorithm are promising for localizing a lung tumor in 4D CT data.
Keywords: automated algorithm, computed tomography, lung tumor, tumor localization
Procedia PDF Downloads 602
10262 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal
Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert
Abstract:
We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of the calculation of the vertex errors, ordering of the error values and removal of vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
Keywords: computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving
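A minimal Python sketch of the per-vertex quadric error computation that underlies this kind of simplification (illustrative only; the GPU implementation, boundary handling, and data layout in the paper will differ):

import numpy as np

def plane_quadric(p0, p1, p2):
    """Fundamental error quadric K = q q^T for the plane of triangle (p0, p1, p2)."""
    n = np.cross(p1 - p0, p2 - p0)
    n = n / np.linalg.norm(n)                 # unit normal (a, b, c)
    d = -np.dot(n, p0)                        # plane: ax + by + cz + d = 0
    q = np.append(n, d)                       # homogeneous plane vector
    return np.outer(q, q)

def vertex_error(vertex, incident_triangles):
    """Quadric error of a vertex: sum of squared distances to its incident planes."""
    Q = sum(plane_quadric(*tri) for tri in incident_triangles)
    v = np.append(vertex, 1.0)                # homogeneous vertex
    return float(v @ Q @ v)

# Vertices with the lowest error values are candidates for parallel removal.
tri = (np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
print(vertex_error(np.array([0.0, 0.0, 0.1]), [tri]))   # 0.01: squared distance to the z = 0 plane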
Procedia PDF Downloads 367
10261 Investigation of Delivery of Triple Play Services
Authors: Paramjit Mahey, Monica Sharma, Jasbinder Singh
Abstract:
Fiber-based access networks can deliver performance that can support the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This paper demonstrates the simultaneous delivery of triple play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.
Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT
Procedia PDF Downloads 541
10260 Diagnostic Physiopathology of Osteitis in the Diabetic Foot
Authors: Adaour Mohamed Amine, Bachene Mohamed Sadek, Fortassi Mosaab, Siouda Wafaa
Abstract:
Foot infections are responsible for a significant number of hospitalizations and amputations in diabetic patients. The objective of our study is to analyze and evaluate the management of the diabetic foot in a surgical setting. A retrospective study was conducted based on a selected case of suspected diabetic foot osteitis treated at the Mohamed Boudiaf hospital in Medea. The case was managed with a staged therapeutic protocol, consisting of first treating the soft tissue infection and then the osteitis, with bone biopsy after at least 15 days of cessation of antibiotic therapy. Successful treatment of the osteitis was defined as complete wound healing at the end of the follow-up period, with no bone resection or amputation surgery at the initial bone site during follow-up; biopsies were instead prescribed during the treatment of the soft tissue infection. The mean duration of treatment for the soft tissue infection was 2-3 weeks, and the antibiotic-free window prior to bone biopsy was 2-4 weeks. The patient received medical management without surgical resection. The success rate for treating osteitis at one year was 73%, and healing at one year was 88%. Treatment is often limited to salvage of the foot at the cost of repeated amputations. The best management remains prevention, which necessarily involves setting up a specialized and adapted centre.
Keywords: osteitis, antibiotic therapy, bone biopsy, diabetic foot
Procedia PDF Downloads 103
10259 Correlation between Cephalometric Measurements and Visual Perception of Facial Profile in Skeletal Type II Patients
Authors: Choki, Supatchai Boonpratham, Suwannee Luppanapornlarp
Abstract:
The objective of this study was to find a correlation between cephalometric measurements and the visual perception of the facial profile in skeletal type II patients. In this study, 250 lateral cephalograms of female patients aged 20 to 22 years were analyzed. The profile outlines of all the samples were hand traced and transformed into silhouettes by the principal investigator. Profile ratings were performed by 9 orthodontists on a Visual Analogue Scale from one to ten (increasing level of convexity). Thirty-seven hard tissue and soft tissue cephalometric measurements were analyzed by the principal investigator. All the measurements were repeated after a 2-week interval for error assessment. Finally, the rankings of visual perception were correlated with the cephalometric measurements using the Spearman correlation coefficient (P < 0.05). The results show that increased facial convexity was correlated with higher values of ANB (A point, nasion and B point), AF-BF (distance from A point to B point in mm), L1-NB (distance from lower incisor to NB line in mm), anterior maxillary alveolar height, posterior maxillary alveolar height, overjet, hard tissue H angle, soft tissue H angle, and lower lip to E plane (absolute correlation values from 0.277 to 0.711). In contrast, increased facial convexity was correlated with lower values of Pg. to N perpendicular and Pg. to NB (mm) (absolute correlation values -0.302 and -0.294, respectively). Among the soft tissue measurements, the H angles had a higher correlation with visual perception than the facial contour angle, nasolabial angle, and lower lip to E plane. In conclusion, the findings of this study indicate that the correlation of cephalometric measurements with visual perception is lower than expected. Only 29% of the cephalometric measurements had a significant correlation with visual perception. Therefore, diagnosis based solely on cephalometric analysis can result in failure to meet the patient’s esthetic expectations.
Keywords: cephalometric measurements, facial profile, skeletal type II, visual perception
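A brief sketch of how such a rank correlation between a cephalometric measurement and the panel's convexity rating could be computed; the values below are purely hypothetical, not the study data:

from scipy.stats import spearmanr

# Hypothetical values: ANB angle (degrees) and median VAS convexity score for six profiles.
anb_deg = [2.1, 4.5, 6.0, 3.2, 7.4, 5.1]
vas_score = [3, 6, 8, 4, 9, 7]

rho, p_value = spearmanr(anb_deg, vas_score)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.3f}")   # considered significant if p < 0.05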
Procedia PDF Downloads 138
10258 Permissible Horizontal Displacements during the Construction of Vertical Shafts in Soft Soils at the Valley of Mexico: Case History
Authors: Joel M. De La Rosa R.
Abstract:
In this paper, the results obtained when monitoring the horizontal deformations of the soil mass during each of the construction stages of several vertical shafts built by the flotation method in the soft soils of the Valley of Mexico are detailed. From the analysis of these results, the magnitude of the horizontal deformations that occurred during the monitoring period, and their percentage relationship to the diameter and depth of excavation, are established. Based on the horizontal deformation monitoring system and the information provided by the supervisor's site log, the construction stages that have the greatest impact on the deformations are identified. Additionally, an analysis of the deformations is carried out that takes into account the strength and deformability characteristics of the excavated soils, as well as the prevailing hydraulic conditions. This work will allow construction engineers and institutions in charge of infrastructure works in the Valley of Mexico to establish permissible ranges for the horizontal deformations that can occur in very soft and saturated soils during the different construction stages, improving response protocols to potentially dangerous behaviors.
Keywords: vertical shaft, flotation method, very soft clays, construction supervision
Procedia PDF Downloads 188
10257 Performance Analysis of MIMO-OFDM Using Convolution Codes with QAM Modulation
Authors: I Gede Puja Astawa, Yoedy Moegiharto, Ahmad Zainudin, Imam Dui Agus Salim, Nur Annisa Anggraeni
Abstract:
The performance of an Orthogonal Frequency Division Multiplexing (OFDM) system can be improved by adding channel coding (an error correction code) to detect and correct the errors that occur during data transmission; one option is the convolution code. This paper presents the performance of OFDM using the Space-Time Block Code (STBC) diversity technique and QAM modulation with a code rate of 1/2. The evaluation is done by analyzing the Bit Error Rate (BER) versus the Energy per Bit to Noise Power Spectral Density Ratio (Eb/No). The scheme uses 256 sub-carriers transmitted over a Rayleigh multipath channel in the OFDM system. To achieve a BER of 10⁻³, the SISO-OFDM scheme requires an SNR of 30 dB. The 2x2 MIMO-OFDM scheme requires 10 dB to achieve a BER of 10⁻³. The 4x4 MIMO-OFDM scheme requires 5 dB, while adding the convolution code to the 4x4 MIMO-OFDM scheme improves the performance down to 0 dB for the same BER. This demonstrates a power saving of 3 dB with respect to the 4x4 MIMO-OFDM system without coding, a power saving of 7 dB with respect to the 2x2 MIMO-OFDM system without coding, and a significant power saving with respect to the SISO-OFDM system.
Keywords: convolution code, OFDM, MIMO, QAM, BER
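A compact Python sketch of the 2x1 Alamouti space-time block encoding and combining step used in such STBC evaluations (flat fading per sub-carrier assumed; the channel coding, OFDM modulation, and the 2x2/4x4 extensions from the paper are omitted):

import numpy as np

def alamouti_encode(s1, s2):
    """Two symbols -> 2 time slots x 2 antennas: [[s1, s2], [-s2*, s1*]]."""
    return np.array([[s1, s2], [-np.conj(s2), np.conj(s1)]])

def alamouti_combine(r, h1, h2):
    """Combine the two received samples r = [r1, r2] using channel estimates h1, h2."""
    r1, r2 = r
    s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
    s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
    return s1_hat, s2_hat            # scaled by (|h1|^2 + |h2|^2); slice to nearest QAM point

# One code block over a quasi-static flat channel (noise omitted for clarity).
h1, h2 = 0.8 + 0.3j, -0.2 + 0.9j
s1, s2 = 1 + 1j, -1 + 1j             # QPSK symbols
X = alamouti_encode(s1, s2)
r = np.array([h1 * X[0, 0] + h2 * X[0, 1], h1 * X[1, 0] + h2 * X[1, 1]])
print(alamouti_combine(r, h1, h2))   # proportional to (s1, s2)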
Procedia PDF Downloads 388
10256 Field-Programmable Gate Array-Based Baseband Signals Generator of X-Band Transmitter for Micro Satellite/CubeSat
Authors: Shih-Ming Wang, Chun-Kai Yeh, Ming-Hwang Shie, Tai-Wei Lin, Chieh-Fu Chang
Abstract:
This paper introduces an FPGA-based baseband signals generator (BSG) for the X-band transmitter developed by the National Space Organization (NSPO), Taiwan, for earth observation. In order to gain more flexibility for various applications, a number of modulation schemes (QPSK, DeQPSK, and 8PSK 4D-TCM) are included. For the micro satellite scenario, the maximum symbol rate is up to 150 Msps and the EVM is as low as 1.9%. For the CubeSat scenario, the maximum symbol rate is up to 60 Msps and the EVM is less than 1.7%. The maximum data rates are 412.5 Mbps and 165 Mbps, respectively. In addition, a triple modular redundancy (TMR) scheme is implemented in order to reduce single event effects (SEE) induced by radiation. Finally, the theoretical error performance is provided based on a comprehensive analysis, especially when the BER is well below 10⁻⁶, owing to the low bit-error requirement of modern high-resolution earth remote-sensing instruments.
Keywords: X-band transmitter, FPGA (Field-Programmable Gate Array), CubeSat, micro satellite
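A minimal sketch of the majority-voting idea behind the triple modular redundancy (TMR) used to mitigate single event effects; the actual design lives in FPGA logic, so this Python model only illustrates the bitwise voting behaviour:

def tmr_vote(a, b, c):
    """Bitwise majority vote of three redundant register copies."""
    return (a & b) | (a & c) | (b & c)

# A single event upset flips a bit in one copy; the vote still returns the correct word.
golden = 0b1011_0110_1100_0011
upset = golden ^ (1 << 7)            # bit 7 flipped in one copy only
assert tmr_vote(golden, upset, golden) == golden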
Procedia PDF Downloads 295
10255 Effects of the Slope Embankment Variation on Influence Areas That Causes the Differential Settlement around of Embankment
Authors: Safitri W. Nur, Prathisto Panuntun L. Unggul, M. Ivan Adi Perdana, R. Dary Wira Mahadika
Abstract:
On soft soil areas, a high embankment is needed as preloading to improve the bearing capacity of the soil. For sustainable development, the construction of an embankment must not disturb the area around it, so the influence area must be known before the contractor applies the embankment design. In several cases in Indonesia, the area around embankment construction consists of residential housing and other buildings. Therefore, the influence area must be identified to avoid differential settlement of the buildings around the embankment. Differential settlement causes building cracks. Each building has a limited tolerance for differential settlement: for concrete buildings the tolerance is 0.002-0.003 m, and for steel buildings it is 0.006-0.008 m. If the differential settlement stays within that range, building cracks can be avoided. In practice, however, the settlement around an embankment is often assumed to be zero, which is why so many problems occur when a high embankment is applied on a soft soil area. This research used the superposition method combined with Plaxis analysis to determine the influence area around embankments at several locations with different soft soil characteristics. Undisturbed soil samples were taken at 55 soft-soil locations in Indonesia. Based on this research, it was concluded that, regarding the effect of embankment slope variation, the gentler the slope, the greater the influence area, and vice versa. The largest influence area, for an initial embankment height of 2-6 m with slopes of 1:1, 1:2, 1:3, 1:4, 1:5, 1:6, 1:7, and 1:8, is 32 m from the edge of the embankment.
Keywords: differential settlement, embankment, influence area, slope, soft soil
Procedia PDF Downloads 408
10254 Forecasting Exchange Rate between Thai Baht and the US Dollar Using Time Series Analysis
Authors: Kunya Bowornchockchai
Abstract:
The objective of this research is to forecast the monthly exchange rate between the Thai baht and the US dollar and to compare two forecasting methods. The methods are the Box-Jenkins method and Holt’s method. Results show that the Box-Jenkins method is the most suitable method for the monthly exchange rate between the Thai baht and the US dollar. The suitable forecasting model is ARIMA (1,1,0) without a constant, and the forecasting equation is Yt = Yt-1 + 0.3691 (Yt-1 - Yt-2), where Yt is the value of the time series at time t.
Keywords: Box–Jenkins method, Holt’s method, mean absolute percentage error (MAPE), exchange rate
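A short sketch of how the reported ARIMA(1,1,0) forecasting equation can be applied recursively to produce the next forecasts; the coefficient 0.3691 is taken from the abstract, while the series values below are placeholders rather than the actual exchange-rate data:

def arima_110_forecast(y, phi=0.3691, steps=3):
    """Recursive forecasts of ARIMA(1,1,0) without constant:
    Y_t = Y_{t-1} + phi * (Y_{t-1} - Y_{t-2})."""
    y = list(y)
    forecasts = []
    for _ in range(steps):
        y_next = y[-1] + phi * (y[-1] - y[-2])
        forecasts.append(y_next)
        y.append(y_next)
    return forecasts

# Placeholder monthly THB/USD values, for illustration only.
history = [33.10, 33.42, 33.65]
print(arima_110_forecast(history))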
Procedia PDF Downloads 254
10253 The Relationship Between Hourly Compensation and Unemployment Rate Using the Panel Data Regression Analysis
Authors: S. K. Ashiquer Rahman
Abstract:
The paper concentrates on the importance of hourly compensation, emphasizing the significance of the unemployment rate. Two of the most important indicators for a nation are its unemployment rate and its hourly compensation. These are not merely statistics: they have profound effects on individuals, families, and the economy, and they are inversely related to one another. The unemployment rate will probably decline as hourly compensation in manufacturing rises, and lower unemployment rates and increased job prospects could result from higher compensation. Thus, increased hourly compensation in the manufacturing sector could have a favorable effect on job mobility. However, the relationship between hourly compensation and unemployment is complex and influenced by broader economic factors. In this paper, we use panel data regression models to evaluate the expected link between hourly compensation and the unemployment rate, in order to determine the effect of hourly compensation on the unemployment rate. We estimate the fixed effects model, evaluate the error components, and determine which model (the FEM or ECM) is better by pooling all 60 observations. We then analyze and review the data by comparing three countries (the United States, Canada, and the United Kingdom) using panel data regression models. Finally, we provide results, analysis, and a summary of the extensive research on how hourly compensation affects the unemployment rate. Additionally, this paper offers relevant and useful information to help the government and the academic community use an econometric and social approach to lessen the effect of hourly compensation on the unemployment rate.
Keywords: hourly compensation, unemployment rate, panel data regression models, dummy variables, random effects model, fixed effects model, linear regression model
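A brief sketch of a fixed effects panel regression of the kind described, in its least-squares dummy variable (LSDV) form with country dummies; the data frame below is a hypothetical stand-in, not the 60 observations used in the study:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: unemployment rate (%) vs. hourly compensation (USD) for 3 countries.
df = pd.DataFrame({
    "country": ["US", "US", "US", "CA", "CA", "CA", "UK", "UK", "UK"],
    "year": [2019, 2020, 2021] * 3,
    "unemployment": [3.7, 8.1, 5.4, 5.7, 9.6, 7.5, 3.8, 4.6, 4.5],
    "compensation": [38.0, 40.1, 42.3, 33.5, 34.8, 36.0, 29.7, 30.9, 32.2],
})

# LSDV form of the fixed effects model: unemployment_it = alpha_i + beta * compensation_it + e_it
fe_model = smf.ols("unemployment ~ compensation + C(country)", data=df).fit()
print(fe_model.params)        # beta estimates the within-country effect of compensation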
Procedia PDF Downloads 81
10252 Of an 80 Gbps Passive Optical Network Using Time and Wavelength Division Multiplexing
Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Faizan Khan, Xiaodong Yang
Abstract:
Internet service providers face endless demands for higher bandwidth and data throughput as new services and applications require more capacity, and users want immediate and accurate data delivery. This article focuses on converting old conventional networks into passive optical networks based on time division and wavelength division multiplexing. The main focus of this research is to use a hybrid of time-division multiplexing and wavelength-division multiplexing to improve network efficiency and performance. In this paper, we design an 80 Gbps Passive Optical Network (PON) that meets the requirements of Next Generation PON Stage 2 (NGPON2). The hybrid of time and wavelength division multiplexing (TWDM) is regarded as the best solution for the implementation of NGPON2, according to the Full Service Access Network (FSAN) group. To co-exist with or replace the current PON technologies, many TWDM wavelengths can be implemented simultaneously. Eight pairs of wavelengths are multiplexed, transmitted over 40 km of optical fiber, and, on the receiving side, distributed among 256 users, which shows that the solution is reliable for implementation at an acceptable data rate. From the results, it can be concluded that the overall performance, quality factor, and bandwidth of the network are increased, and the bit error rate is minimized, by the integration of this approach.
Keywords: bit error rate, fiber to the home, passive optical network, time and wavelength division multiplexing
Procedia PDF Downloads 70
10251 Error Probability of Multi-User Detection Techniques
Authors: Komal Babbar
Abstract:
Multiuser detection is the intelligent estimation/demodulation of transmitted bits in the presence of multiple access interference (MAI). The authors present the bit error rate (BER) achieved by linear multi-user detectors: the matched filter (which treats the MAI as AWGN), the decorrelating detector, and the MMSE detector. In this work, the authors investigate the bit error probability analysis for the matched filter, decorrelating, and MMSE detectors. This problem arises in several practical CDMA applications where the receiver may not have full knowledge of the number of active users and their signature sequences. In particular, the behavior of the MAI at the output of the multi-user detectors (MUD) is examined under various asymptotic conditions, including large signal-to-noise ratio, large near-far ratios, and a large number of users. In the last section, the authors also show MATLAB simulation results for the multiuser detection techniques, i.e., matched filter, decorrelating, and MMSE, for 2 users and 10 users.
Keywords: code division multiple access, decorrelating, matched filter, minimum mean square error (MMSE) detection, multiple access interference (MAI), multiuser detection (MUD)
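A compact Python sketch of the three linear detectors for a synchronous CDMA model r = S A b + n (matched filter, decorrelating, and MMSE); the spreading sequences, amplitudes, and noise level are illustrative choices, not those used in the paper's simulations:

import numpy as np

np.random.seed(0)
K, N, sigma2 = 2, 8, 0.1                            # users, spreading gain, noise variance
S = np.sign(np.random.randn(N, K)) / np.sqrt(N)     # normalized signature sequences
A = np.diag([1.0, 2.0])                             # user amplitudes (near-far scenario)
b = np.array([1, -1])                               # transmitted bits
r = S @ A @ b + np.sqrt(sigma2) * np.random.randn(N)

y = S.T @ r                                         # matched filter outputs (MAI treated as noise)
R = S.T @ S                                         # cross-correlation matrix
b_mf = np.sign(y)
b_dec = np.sign(np.linalg.solve(R, y))                                   # decorrelating detector
b_mmse = np.sign(np.linalg.solve(R + sigma2 * np.linalg.inv(A @ A), y))  # MMSE detector
print(b_mf, b_dec, b_mmse)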
Procedia PDF Downloads 528
10250 Study and Analysis of the Factors Affecting Road Safety Using Decision Tree Algorithms
Authors: Naina Mahajan, Bikram Pal Kaur
Abstract:
The purpose of traffic accident analysis is to find the possible causes of an accident. Road accidents cannot be totally prevented, but by suitable traffic engineering and management the accident rate can be reduced to a certain extent. This paper discusses the classification techniques C4.5 and ID3 using the WEKA data mining tool. These techniques are applied to the NH (National Highway) dataset. The C4.5 and ID3 techniques give the best results, with high accuracy, low computation time, and a low error rate.
Keywords: C4.5, ID3, NH (National Highway), WEKA data mining tool
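A small sketch of an entropy-based decision tree of the ID3/C4.5 family on hypothetical accident attributes; the study itself uses the WEKA implementations, so the scikit-learn call below only illustrates the idea:

from sklearn.tree import DecisionTreeClassifier

# Hypothetical encoded records: [road_type, light_condition, weather, speed_limit]
X = [[0, 0, 0, 60], [1, 1, 0, 80], [1, 0, 1, 100], [0, 1, 1, 60],
     [1, 1, 1, 100], [0, 0, 1, 80], [1, 0, 0, 100], [0, 1, 0, 60]]
y = [0, 1, 1, 0, 1, 0, 1, 0]                 # 1 = severe accident, 0 = minor

clf = DecisionTreeClassifier(criterion="entropy", max_depth=3)  # entropy split, as in ID3/C4.5
clf.fit(X, y)
print(clf.predict([[1, 1, 1, 80]]))          # predicted severity class for a new record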
Procedia PDF Downloads 338
10249 Performance Evaluation of MIMO-OFDM Communication Systems
Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany
Abstract:
This paper evaluates the bit error rate (BER) performance of a MIMO-OFDM communication system. A MIMO system uses multiple transmit and receive antennas with different coding techniques to enhance either the transmission diversity or the spatial multiplexing gain. With the Alamouti algorithm, the same information is transmitted over multiple antennas at different time intervals and then combined at the receiver to minimize the probability of error, combat fading, and thus improve the received signal-to-noise ratio. With the V-BLAST algorithm, the transmitted signal is divided into different streams sent from different transmit antennas and received by different receive antennas, to increase the transmitted data rate and achieve higher throughput. The paper provides a study of different diversity gain coding schemes and spatial multiplexing coding for MIMO systems. A comparison of various channel estimation and equalization techniques is given. The simulation is implemented using MATLAB, and the results show the performance of the transmission models under different channel environments.
Keywords: MIMO communication, BER, space codes, channels, Alamouti, V-BLAST
Procedia PDF Downloads 175
10248 The Prevalence of Intubation Induced Dental Complications among Hospitalized Patients
Authors: Dorsa Rahi, Arghavan Tonkanbonbi, Soheila Manifar, Behzad Jafvarnejad
Abstract:
Background and Aim: Intraoral manipulation is performed during endotracheal intubation for general anesthesia, which can traumatize the soft and hard tissues in the oral cavity and cause postoperative pain and discomfort. Dental trauma is the most common complication of intubation. This study aimed to assess the prevalence of dental complications due to intubation in patients hospitalized at Imam Khomeini Hospital during 2018-2019. Materials and Methods: A total of 805 patients presenting to the Cancer Institute of Imam Khomeini Hospital for preoperative anesthesia consultation were randomly enrolled. A dentist interviewed the patients and performed a comprehensive clinical oral examination preoperatively. The patients underwent a clinical oral examination by another dentist postoperatively. Results: No significant correlation was found between dental trauma (tooth fracture, tooth mobility, or soft tissue injury) after intubation and the age or gender of the patients. According to the Wilcoxon test and the McNemar-Bowker test, the rate of tooth mobility before intubation was significantly different from that after intubation (P=0.000). The maxillary central incisors, the maxillary left canine, and the mandibular right and left central incisors had the highest rate of fracture. Conclusion: Teeth that are mobile before intubation are at higher risk of avulsion and aspiration during the procedure. Patients with primary temporomandibular joint disorders are more susceptible to post-intubation trismus.
Keywords: oral trauma, dental trauma, intubation, anesthesia
Procedia PDF Downloads 148
10247 A 5-V to 30-V Current-Mode Boost Converter with Integrated Current Sensor and Power-on Protection
Authors: Jun Yu, Yat-Hei Lam, Boris Grinberg, Kevin Chai Tshun Chuan
Abstract:
This paper presents a 5-V to 30-V current-mode boost converter for powering the drive circuit of a micro-electro-mechanical sensor. The designs of a transconductance amplifier and an integrated current sensing circuit are presented. In addition, essential building blocks for power-on protection, such as the soft-start and clamp block and the supply and clock ready block, are discussed in detail. The chip is fabricated in a 0.18-μm CMOS process. Measurement results show that the soft-start and clamp block can effectively limit the inrush current during startup and protect the boost converter from startup failure.
Keywords: boost converter, current sensing, power-on protection, step-up converter, soft-start
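For orientation, the ideal continuous-conduction-mode boost relation Vout = Vin / (1 - D) implies the nominal duty cycle needed for this 5-V to 30-V conversion; the snippet below is only this textbook estimate and ignores losses and the converter's actual current-mode control loop:

v_in, v_out = 5.0, 30.0
duty = 1.0 - v_in / v_out                    # ideal CCM boost: Vout = Vin / (1 - D)
print(f"nominal duty cycle = {duty:.3f}")    # about 0.833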
Procedia PDF Downloads 1019
10246 An Improved Cooperative Communication Scheme for IoT System
Authors: Eui-Hak Lee, Jae-Hyun Ro, Hyoung-Kyu Song
Abstract:
In an Internet of Things (IoT) system, a communication scheme with high reliability and low power is required to connect terminals. Cooperative communication can achieve higher reliability and lower power than a multiple-input multiple-output (MIMO) system. However, while cooperative communication increases reliability at low power, its weak point is that the communication throughput is decreased. In this paper, a novel scheme is proposed to increase the communication throughput: a transmission structure that increases the transmission rate, together with a decoding scheme matched to this transmission structure. Simulation results show that the proposed scheme increases the throughput without degrading the bit error rate (BER) performance.
Keywords: cooperative communication, IoT, STBC, transmission rate
Procedia PDF Downloads 396
10245 A Method for Improving the Embedded Runge Kutta Fehlberg 4(5)
Authors: Sunyoung Bu, Wonkyu Chung, Philsu Kim
Abstract:
In this paper, we introduce a method for improving the embedded Runge-Kutta-Fehlberg 4(5) method. At each integration step, the proposed method comprises two equations, one for the solution and one for the error. This solution and error are obtained by solving an initial value problem whose solution carries the information about the error at each integration step. The constructed algorithm controls both the error and the time step size simultaneously and performs well in terms of computational cost compared to the original method. To assess its effectiveness, the EULR problem is solved numerically.
Keywords: embedded Runge-Kutta-Fehlberg method, initial value problem, EULR problem, integration step
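For reference, a minimal Python sketch of the classical embedded Runge-Kutta-Fehlberg 4(5) step that the paper improves upon: six stage evaluations yield both a 4th-order and a 5th-order solution, and their difference estimates the local error used for step-size control.

import numpy as np

# Fehlberg coefficients: stage couplings A, nodes C, and the two sets of output weights.
A = [[], [1/4], [3/32, 9/32], [1932/2197, -7200/2197, 7296/2197],
     [439/216, -8, 3680/513, -845/4104], [-8/27, 2, -3544/2565, 1859/4104, -11/40]]
C = [0, 1/4, 3/8, 12/13, 1, 1/2]
B4 = [25/216, 0, 1408/2565, 2197/4104, -1/5, 0]          # 4th-order weights
B5 = [16/135, 0, 6656/12825, 28561/56430, -9/50, 2/55]   # 5th-order weights

def rkf45_step(f, t, y, h):
    """One embedded RKF4(5) step: returns (y4, y5, error_estimate)."""
    k = []
    for i in range(6):
        yi = y + h * sum(a * kj for a, kj in zip(A[i], k))
        k.append(f(t + C[i] * h, yi))
    y4 = y + h * sum(b * kj for b, kj in zip(B4, k))
    y5 = y + h * sum(b * kj for b, kj in zip(B5, k))
    return y4, y5, abs(y5 - y4)      # |y5 - y4| drives the step-size controller

# Example: y' = -2y, y(0) = 1, exact solution exp(-2t).
y4, y5, err = rkf45_step(lambda t, y: -2.0 * y, 0.0, 1.0, 0.1)
print(y4, y5, err, np.exp(-0.2))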
Procedia PDF Downloads 463
10244 Changing Arbitrary Data Transmission Period by Using Bluetooth Module on Gas Sensor Node of Arduino Board
Authors: Hiesik Kim, Yong-Beom Kim, Jaheon Gu
Abstract:
Internet of Things (IoT) applications are widely deployed and spreading worldwide, and local wireless data transmission techniques must be developed to keep pace with them. Bluetooth is a wireless data communication technique defined by the Bluetooth Special Interest Group (SIG) that uses the 2.4 GHz frequency range and exploits frequency hopping to avoid collisions with other devices. For the experiment, equipment for transmitting measured data is built using an Arduino open-source hardware board, a gas sensor, and a Bluetooth module, and an algorithm controlling the transmission rate is demonstrated. The transmission-rate control experiment is also carried out by developing an Android application that receives the measured data, and the experimental results show that the rate can be controlled. In the future, the communication algorithm will need improvement, because a few errors occur when data is transmitted or received.
Keywords: Arduino, Bluetooth, gas sensor, IoT, transmission
Procedia PDF Downloads 277
10243 Quantification of Soft Tissue Artefacts Using Motion Capture Data and Ultrasound Depth Measurements
Authors: Azadeh Rouhandeh, Chris Joslin, Zhen Qu, Yuu Ono
Abstract:
The centre of rotation of the hip joint is needed for an accurate simulation of joint performance in many applications, such as pre-operative planning simulation, human gait analysis, and the study of hip joint disorders. In human movement analysis, the hip joint centre can be estimated using a functional method based on the relative motion of the femur with respect to the pelvis, measured using reflective markers attached to the skin surface. The principal source of error in estimating the hip joint centre location using functional methods is the soft tissue artefact due to the relative motion between the markers and the bone. One of the main objectives in human movement analysis is the assessment of soft tissue artefact, as the accuracy of functional methods depends upon it. Various studies have measured soft tissue artefact invasively, using intra-cortical pins, external fixators, percutaneous skeletal trackers, and Roentgen photogrammetry. The goal of this study is to present a non-invasive method to assess the displacements of the markers relative to the underlying bone using optical motion capture data and tissue thickness from ultrasound measurements during flexion, extension, and abduction (all with the knee extended) of the hip joint. Results show that the skin marker artefact displacements are non-linear and larger in areas closer to the hip joint. Marker displacements also depend on the movement type and are relatively larger in the abduction movement. The quantification of soft tissue artefacts can be used as a basis for a correction procedure for hip joint kinematics.
Keywords: hip joint center, motion capture, soft tissue artefact, ultrasound depth measurement
Procedia PDF Downloads 281
10242 Permeability Prediction Based on Hydraulic Flow Unit Identification and Artificial Neural Networks
Authors: Emad A. Mohammed
Abstract:
The concept of hydraulic flow units (HFU) has been used for decades in the petroleum industry to improve the prediction of permeability. This concept is strongly related to the flow zone indicator (FZI), which is a function of the reservoir rock quality index (RQI). Both indices are based on the reservoir porosity and permeability of core samples. It is assumed that core samples with similar FZI values belong to the same HFU. Thus, after dividing the porosity-permeability data based on the HFU, transformations can be applied in order to estimate the permeability from the porosity. The conventional practice is to use the power law transformation with conventional HFU, where the percentage of error is considerably high. In this paper, a neural network technique is employed as a soft computing transformation method to predict permeability instead of the power law method, in order to avoid the higher percentage of error. This technique is based on HFU identification, for which the method of Amaefule et al. (1993) is utilized. In this regard, the Kozeny-Carman (K-C) model and the modified K-C model of Hasan and Hossain (2011) are employed. A comparison is made between the two transformation techniques for the two porosity-permeability models. Results show that the modified K-C model helps in obtaining better results, with a lower percentage of error in predicting permeability. The results also show that the use of artificial intelligence techniques gives more accurate predictions than the power law method. This study was conducted on a heterogeneous, complex carbonate reservoir in Oman. Data were collected from seven wells to obtain the permeability correlations for the whole field. The findings of this study will help in obtaining better estimates of permeability for a complex reservoir.
Keywords: permeability, hydraulic flow units, artificial intelligence, correlation
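A short sketch of the flow zone indicator calculation on which HFU identification is commonly based (the usual Amaefule et al. form, with permeability in mD and porosity as a fraction); a neural network or power-law fit would then be applied within each HFU:

import numpy as np

def flow_zone_indicator(perm_md, phi):
    """RQI = 0.0314 * sqrt(k/phi), phi_z = phi/(1-phi), FZI = RQI/phi_z."""
    rqi = 0.0314 * np.sqrt(perm_md / phi)
    phi_z = phi / (1.0 - phi)
    return rqi / phi_z

# Core samples with similar FZI values are grouped into the same hydraulic flow unit.
perm = np.array([1.2, 15.0, 230.0])     # permeability, mD
phi = np.array([0.08, 0.15, 0.22])      # porosity, fraction
print(flow_zone_indicator(perm, phi))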
Procedia PDF Downloads 136
10241 Determination of Activation Energy for Thermal Decomposition of Selected Soft Tissues Components
Authors: M. Ekiert, T. Uhl, A. Mlyniec
Abstract:
Tendons are biological soft tissue structures composed of collagen, proteoglycans, glycoproteins, water, and cells of the extracellular matrix (ECM). Tendons, whose primary function is to transfer the force generated by the muscles to the bones, causing joint movement, are exposed to many micro- and macro-scale damages. In fact, tendon and ligament trauma is among the most numerous injuries of the human musculoskeletal system, causing for many people (particularly athletes and physically active people) recurring disorders, chronic pain, or even inability to move. The number of tendon reconstruction and transplantation procedures is increasing every year. Therefore, studies on the storage conditions of soft tissues (which influence, e.g., tissue aging) seem to be an extremely important issue. In this study, an atomic-scale investigation of the kinetics of decomposition of two selected tendon components is presented: type I collagen (which forms 60-85% of the tendon dry mass) and elastin (which, combined with the ECM, creates the elastic fibers of connective tissues). Molecular models of collagen and elastin were developed based on the crystal structure of the triple-helical collagen-like 1QSU peptide and the P15502 human elastin protein, respectively. Each model employed 4 linear collagen/elastin strands per unit cell, distributed in a 2x2 matrix arrangement and placed in a simulation box filled with water molecules. The decomposition phenomena were simulated with the molecular dynamics (MD) method using the ReaxFF force field and periodic boundary conditions. A set of NVT-MD runs was performed over a 1000 K temperature range in order to obtain the temperature-dependent rates of production of the decomposition by-products. Based on the calculated reaction rates, the activation energies and pre-exponential factors required to formulate the Arrhenius equations describing the kinetics of decomposition of the tested soft tissue components were calculated. Moreover, by adjusting the model developed for collagen, the system scalability and the correct implementation of the periodic boundary conditions were evaluated. The obtained results provide a deeper insight into the decomposition of the selected tendon components. The developed methodology may also be easily transferred to other connective tissue elements and therefore might be used for further studies on soft tissue aging.
Keywords: decomposition, molecular dynamics, soft tissue, tendons
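A brief sketch of how an activation energy and pre-exponential factor can be extracted from temperature-dependent rates via the Arrhenius relation k = A exp(-Ea/RT), i.e. a linear fit of ln k against 1/T; the rate values below are placeholders, not the simulation output:

import numpy as np

R = 8.314                                           # gas constant, J/(mol*K)
T = np.array([1200.0, 1400.0, 1600.0, 1800.0])      # temperatures, K
k = np.array([2.1e3, 4.0e4, 3.2e5, 1.6e6])          # placeholder reaction rates

slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)   # ln k = ln A - Ea/(R*T)
Ea = -slope * R                                     # activation energy, J/mol
A = np.exp(intercept)                               # pre-exponential factor
print(f"Ea = {Ea / 1000:.1f} kJ/mol, A = {A:.3e}")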
Procedia PDF Downloads 210
10240 An Improved Model of Estimation Global Solar Irradiation from in situ Data: Case of Oran Algeria Region
Authors: Houcine Naim, Abdelatif Hassini, Noureddine Benabadji, Alex Van Den Bossche
Abstract:
In this paper, two models to estimate the monthly average daily global radiation on a horizontal surface were applied to the site of Oran (35.38°N, 0.37°W). We present a comparison between them: the first is a regression equation of the Angstrom type, and the second is a model developed by the present authors, with some modifications suggested using as input parameters astronomical parameters (latitude, longitude, and altitude) and a meteorological parameter (relative humidity). The comparisons are made using the mean bias error (MBE), root mean square error (RMSE), mean percentage error (MPE), and mean absolute bias error (MABE). This comparison shows that the second model is closer to the experimental values than the Angstrom model.
Keywords: meteorology, global radiation, Angstrom model, Oran
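A small sketch of the comparison statistics named above (MBE, RMSE, MPE, MABE) for a set of estimated versus measured monthly radiation values; the numbers are illustrative only, not the Oran data:

import numpy as np

measured = np.array([12.4, 15.8, 19.6, 22.3, 24.9, 26.1])    # illustrative, MJ/m^2/day
estimated = np.array([12.9, 15.2, 20.1, 21.8, 25.6, 25.4])

diff = estimated - measured
mbe = diff.mean()                                # mean bias error
rmse = np.sqrt((diff ** 2).mean())               # root mean square error
mpe = (diff / measured * 100.0).mean()           # mean percentage error, %
mabe = np.abs(diff).mean()                       # mean absolute bias error
print(mbe, rmse, mpe, mabe)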
Procedia PDF Downloads 232
10239 Reliability of the Estimate of Earthwork Quantity Based on 3D-BIM
Authors: Jaechoul Shin, Juhwan Hwang
Abstract:
When the BIM method is applied to civil engineering, in particular to free-formed structures, a comparatively high construction productivity can be expected, as in the building engineering area. In this research, we evaluated the quantity calculation error by applying the method to earthwork and bridge construction (e.g., a PSC-I type segmental girder bridge and an integrated bridge of steel I-girders and an inverted-tee bent cap), NATM (New Austrian Tunneling Method) tunnel construction, retaining wall construction, and culvert construction, and implemented a BIM-based 3D modeling quantity survey. We confirmed the high reliability of the BIM-based method in structure work, in which errors occurred in the range of -6% to +5%. In particular, rock-type quantity calculation errors in the range of -14% to +13% of the earthwork quantity highlighted the problems of the existing 2D-CAD-based quantity calculation and the scope for its improvement. This demonstrates the benefit and applicability of the BIM method in civil engineering. In addition, for the routine method, the error tolerance for earthwork quantity is as negligible as that of structure work, but the significant error in the rock-type quantity shows that the reliability of 2D-based volume calculation can be a problem. By estimating the earthwork quantity based on 3D-BIM, the proposed method achieves better reliability than the routine method. Considering the benefits of integrating information at the design, construction, and maintenance levels, the effectiveness of introducing BIM design in civil engineering and the possibility of applying it were confirmed.
Keywords: BIM, 3D modeling, 3D-BIM, quantity of earthwork
Procedia PDF Downloads 442
10238 Improved Performance Scheme for Joint Transmission in Downlink Coordinated Multi-Point Transmission
Authors: Young-Su Ryu, Su-Hyun Jung, Myoung-Jin Kim, Hyoung-Kyu Song
Abstract:
In this paper, an improved performance scheme for joint transmission is proposed for the downlink (DL) coordinated multi-point (CoMP) system in the case of constrained transmission power. In this scheme, the serving transmission point (TP) requests joint transmission from a cooperating TP and selects one precoding technique according to the channel state information (CSI) from the user equipment (UE). The simulation results show that the bit error rate (BER) and throughput performance of the proposed scheme provide high spectral efficiency and reliable data at the cell edge.
Keywords: CoMP, joint transmission, minimum mean square error, zero-forcing, zero-forcing dirty paper coding
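A minimal sketch of one candidate precoder for such joint transmission, zero-forcing across the cooperating transmission points with the result scaled to respect the power constraint; the actual selection between ZF, MMSE, and ZF-DPC according to the fed-back CSI is not reproduced here:

import numpy as np

np.random.seed(1)
n_ue, n_tp = 2, 4                              # receive streams (UEs) and total TP antennas
H = (np.random.randn(n_ue, n_tp) + 1j * np.random.randn(n_ue, n_tp)) / np.sqrt(2)

W = H.conj().T @ np.linalg.inv(H @ H.conj().T)   # zero-forcing precoder
W = W / np.linalg.norm(W)                        # normalize to the transmit power constraint
print(np.round(H @ W, 3))                        # scaled identity: no inter-UE interference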
Procedia PDF Downloads 553
10237 Employees and Their Perception of Soft Skills on Their Employability
Authors: Sukrita Mukherjee, Anindita Chaudhuri
Abstract:
Soft skills are a crucial aspect for employees; these skills are not confined to any particular field and, rather, support further career growth and job opportunities for employees who are seeking growth. Soft skills are also regarded as personality-specific skills that are observable and qualitative in nature, and which determine an employee’s strengths as a leader. When employees intend to keep their job, they must make effective use of their personal resources, which, in turn, impacts their employability in a positive manner. The resources an employee is expected to use at the workplace are generally of two types. The first type is occupation-related, linked to the educational background of the employee; the second type comprises the psychological resources of the employee, such as self-knowledge, career orientation awareness, sense of purpose, and emotional literacy, which are considered crucial for an employee in the workplace. The present study is a qualitative study that includes 10 individuals working in the IT sector and the service industry, respectively. For the IT sector, graduates are considered, and for the service industry, individuals who have completed a professional course in order to get into the industry are considered. The themes emerging from the findings after thematic analysis reveal that different aspects of soft skills, such as communication, decision making, constant learning, keeping oneself updated with the latest technological advancements, and emotional intelligence, are some of the important factors that help an employee not only to sustain a job but also to grow in the workplace.
Keywords: employability, soft skills, employees, resources, workplace
Procedia PDF Downloads 63