Search results for: vector optimization
2245 Predictive Maintenance of Electrical Induction Motors Using Machine Learning
Authors: Muhammad Bilal, Adil Ahmed
Abstract:
This study proposes an approach for electrical induction motor predictive maintenance utilizing machine learning algorithms. The goal is to predict motor failures from temperature data obtained from sensors placed on the motor. The proposed models are trained to identify whether a motor is defective or not by utilizing machine learning algorithms such as Support Vector Machines (SVM) and K-Nearest Neighbors (KNN). According to a thorough study of the literature, earlier research has used motor current signature analysis (MCSA) and vibration data to forecast motor failures. The temperature signal methodology, which has clear advantages over the conventional MCSA and vibration analysis methods in terms of cost-effectiveness, is the main subject of this research. The acquired results emphasize the applicability and effectiveness of the temperature-based predictive maintenance strategy by demonstrating the successful categorization of defective motors using the suggested machine learning models.
Keywords: predictive maintenance, electrical induction motors, machine learning, temperature signal methodology, motor failures
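A minimal sketch (not the authors' code) of the classification step the abstract describes: simple statistics of the temperature signal feed SVM and KNN classifiers. The feature set and the synthetic data are assumptions for illustration only.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Placeholder data: rows = motors, columns = assumed statistics of the
# temperature signal (mean, std, max, rate of rise); label 1 = defective.
X = rng.normal(size=(200, 4))
y = (X[:, 2] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
for name, clf in [("SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf"))),
                  ("KNN", make_pipeline(StandardScaler(), KNeighborsClassifier(5)))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))
```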
Procedia PDF Downloads 118
2244 Training of Future Computer Science Teachers Based on Machine Learning Methods
Authors: Meruert Serik, Nassipzhan Duisegaliyeva, Danara Tleumagambetova
Abstract:
The article highlights and describes the characteristic features of real-time face detection in images and videos using machine learning algorithms. Students of the educational programs "6B01511-Computer Science", "7M01511-Computer Science", "7M01525-STEM Education", and "8D01511-Computer Science" of the L.N. Gumilyov Eurasian National University reviewed the research work. As a result, the advantages and disadvantages of the Haar Cascade (Haar Cascade OpenCV), HoG SVM (Histogram of Oriented Gradients, Support Vector Machine), and MMOD CNN Dlib (Max-Margin Object Detection, convolutional neural network) detectors used for face detection were determined. Dlib is a general-purpose cross-platform software library written in the programming language C++; it includes the detectors used for face detection. The Haar Cascade OpenCV algorithm is efficient for fast face detection. The considered work forms the basis for the development of machine learning methods by future computer science teachers.
Keywords: algorithm, artificial intelligence, education, machine learning
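A rough sketch of how the three detectors compared in the article are typically invoked. The input image name is a placeholder; OpenCV ships the Haar cascade XML, while the dlib CNN weights file named here must be downloaded separately.

```python
import cv2
import dlib

img = cv2.imread("students.jpg")                 # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
rgb = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)       # dlib expects RGB order

# 1) Haar Cascade (OpenCV): fastest, works on grayscale images.
haar = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
haar_boxes = haar.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# 2) HoG + SVM (dlib): oriented-gradient features with a linear SVM,
#    more robust to pose than Haar while remaining CPU-friendly.
hog = dlib.get_frontal_face_detector()
hog_boxes = hog(rgb, 1)                          # 1 = upsample image once

# 3) MMOD CNN (dlib): max-margin CNN detector, most accurate but slowest;
#    the weights file must be obtained separately.
cnn = dlib.cnn_face_detection_model_v1("mmod_human_face_detector.dat")
cnn_boxes = cnn(rgb, 1)

print(len(haar_boxes), len(hog_boxes), len(cnn_boxes))
```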
Procedia PDF Downloads 73
2243 Effect of Atmospheric Pressure on the Flow at the Outlet of a Propellant Nozzle
Authors: R. Haoui
Abstract:
The purpose of this work is to simulate the flow at the exit of the Vulcain 1 engine of the European launcher Ariane 5. The geometry of the propellant nozzle was already determined using the method of characteristics. The pressure in the outlet section of the nozzle is less than the atmospheric pressure on the ground, causing oblique and normal shock waves at the exit. During the ascent of the launcher, the atmospheric pressure decreases and the shock wave disappears. The code allows the capture of the shock wave at the exit of the nozzle. The numerical technique uses the Flux Vector Splitting method of Van Leer to ensure convergence and avoid calculation instabilities. The Courant-Friedrichs-Lewy (CFL) coefficient and the mesh size level are selected to ensure numerical convergence. The system of nonlinear partial differential equations which governs this flow is solved by an explicit unsteady numerical scheme using the finite volume method. The accuracy of the solution depends on the size of the mesh and also on the time step used in the discretized equations. We have chosen in this study the mesh that gives us a stationary solution with good accuracy.
Keywords: finite volume, launchers, nozzles, shock wave
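A minimal sketch of Van Leer's Flux Vector Splitting for the 1D Euler equations, the scheme the abstract names (the actual solver is multidimensional and more elaborate). The value of gamma and the two states are assumed for illustration.

```python
import numpy as np

GAMMA = 1.4

def van_leer_split(rho, u, p):
    """Return (F_plus, F_minus) for one cell state."""
    a = np.sqrt(GAMMA * p / rho)            # speed of sound
    M = u / a                               # Mach number
    E = p / (GAMMA - 1.0) + 0.5 * rho * u**2
    F = np.array([rho * u, rho * u**2 + p, u * (E + p)])   # physical flux
    if M >= 1.0:                            # supersonic to the right
        return F, np.zeros(3)
    if M <= -1.0:                           # supersonic to the left
        return np.zeros(3), F
    # Subsonic: split the mass flux; momentum and energy follow Van Leer.
    fp = rho * a * (M + 1.0)**2 / 4.0
    Fp = np.array([fp,
                   fp * ((GAMMA - 1.0) * u + 2.0 * a) / GAMMA,
                   fp * ((GAMMA - 1.0) * u + 2.0 * a)**2
                   / (2.0 * (GAMMA**2 - 1.0))])
    Fm = F - Fp                             # consistency: F = F+ + F-
    return Fp, Fm

# Upwind interface flux between a left and a right state (assumed values):
Fl_p, _ = van_leer_split(1.0, 400.0, 1.0e5)
_, Fr_m = van_leer_split(0.5, 350.0, 5.0e4)
print("interface flux:", Fl_p + Fr_m)
```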
Procedia PDF Downloads 289
2242 An Evaluation Model for Automatic Map Generalization
Authors: Quynhan Tran, Hong Fan, Quockhanh Pham
Abstract:
Automatic map generalization is a well-known problem in cartography. The development of map generalization research has accompanied the development of cartography. The traditional map is plotted manually by cartographic experts. This paper studies the non-scale automatic generalization of resident polygons and house marker symbols, and proposes a methodology to evaluate the resulting maps based on the minimal spanning tree. In this paper, the minimal spanning tree before and after map generalization is compared to evaluate whether the generalization result maintains the geographical distribution of features. The minimal spanning tree in vector format is first converted into a raster format with a grid size of 2 mm (distance on the map). The number of matching grid cells before and after map generalization and the ratio of overlapping grid cells to the total grid cells are calculated. Evaluation experiments are conducted to verify the results. Experiments show that this methodology can give an objective evaluation of the feature distribution and lend specialists a hand while they evaluate the resulting maps of non-scale automatic generalization with their eyes.
Keywords: automatic cartography generalization, evaluation model, geographic feature distribution, minimal spanning tree
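A schematic sketch of the evaluation idea: build the minimum spanning tree over feature centroids before and after generalization, rasterize each tree at a fixed cell size, and report the overlapping-grid ratio. The centroids, grid extent, and sampling density are assumptions, not the paper's data.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

def mst_edges(points):
    """Edge list (i, j) of the MST over a set of 2D points."""
    mst = minimum_spanning_tree(cdist(points, points)).tocoo()
    return list(zip(mst.row, mst.col))

def rasterize(points, edges, cell=2.0, size=128):
    """Burn MST segments into a boolean grid (cell = 2 mm map distance)."""
    grid = np.zeros((size, size), dtype=bool)
    for i, j in edges:
        for t in np.linspace(0.0, 1.0, 200):     # sample along the segment
            x, y = points[i] * (1 - t) + points[j] * t
            grid[int(y // cell) % size, int(x // cell) % size] = True
    return grid

rng = np.random.default_rng(1)
before = rng.uniform(0, 200, size=(30, 2))       # placeholder centroids
after = before + rng.normal(scale=1.5, size=before.shape)  # generalized map

g1 = rasterize(before, mst_edges(before))
g2 = rasterize(after, mst_edges(after))
print(f"overlapping-grid ratio: {(g1 & g2).sum() / g1.sum():.2f}")
```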
Procedia PDF Downloads 636
2241 Cointegration Dynamics in Asian Stock Markets: Implications for Long-Term Portfolio Management
Authors: Xinyi Xu
Abstract:
This study conducts a detailed examination of Asian stock markets over the period from 2008 to 2023, with a focus on the dynamics of cointegration and their relevance for long-term investment strategies. Specifically, we assess the co-movement and potential for pairs trading—a strategy where investors take opposing positions on two stocks, indices, or financial instruments that historically move together. For example, we explore the relationship between the Nikkei 225 (N225), Japan’s benchmark stock index, and the Straits Times Index (STI) of Singapore, as well as the relationship between the Korea Composite Stock Price Index (KS11) and the STI. The methodology includes tests for normality, stationarity, cointegration, and the application of Vector Error Correction Modeling (VECM). Our findings reveal significant long-term relationships between these pairs, indicating opportunities for pairs trading strategies. Furthermore, the research underscores the challenges posed by model instability and the influence of major global incidents, which are identified as structural breaks. These findings pave the way for further exploration into the intricacies of financial market dynamics.
Keywords: normality tests, stationarity, cointegration, VECM, pairs trading
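A compact sketch of the testing pipeline the abstract lists: a stationarity check, an Engle-Granger cointegration test on an index pair, then a VECM fit. The series here are synthetic stand-ins; in practice they would be the N225 and STI closing prices.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, coint
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(7)
common = np.cumsum(rng.normal(size=1000))        # shared stochastic trend
n225 = 100 + common + rng.normal(size=1000)      # fake Nikkei 225 levels
sti = 50 + 0.5 * common + rng.normal(size=1000)  # fake STI levels
df = pd.DataFrame({"N225": n225, "STI": sti})

print("ADF p-value (N225 level):", adfuller(df["N225"])[1])   # stationarity
t_stat, p_val, _ = coint(df["N225"], df["STI"])               # cointegration
print("Engle-Granger cointegration p-value:", p_val)

vecm = VECM(df, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print("error-correction loadings (alpha):\n", vecm.alpha)
```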
Procedia PDF Downloads 56
2240 Kuwait Environmental Remediation Program: Waste Management Data Analytics for Planning and Optimization of Waste Collection
Authors: Aisha Al-Baroud
Abstract:
The United Nations Compensation Commission (UNCC), the Kuwait National Focal Point (KNFP) and the Kuwait Oil Company (KOC) cooperated in a joint project to undertake comprehensive and collaborative efforts to remediate 26 million m3 of crude oil contaminated soil that had resulted from the Gulf War in 1990/1991. These efforts are referred to as the Kuwait Environmental Remediation Program (KERP). KOC has developed a Total Remediation Solution (TRS) for KERP, which will guide the remediation projects. It comprises alternative remedial solutions with treatment techniques, inclusive of limited landfills for the disposal of non-treatable soil materials, and relies on treating certain ranges of Total Petroleum Hydrocarbon (TPH) contamination with the most appropriate remediation techniques. The KERP remediation projects will be implemented within KOC’s oilfields in North and South East Kuwait. The objective of this remediation project is to clear land for field development and treat all the oil contaminated features (dry oil lakes, wet oil lakes, and oil contaminated piles) through the TRS plan to optimize the treatment processes and minimize the volume of contaminated materials to be placed into landfills. The treatment strategy will comprise Excavation and Transportation (E&T) of oil contaminated soils from contaminated land to remote treatment areas and the use of appropriate remediation technologies, or a combination of treatment technologies, to achieve the remediation target criteria (RTC). KOC has awarded five mega projects to achieve the same and is currently in the execution phase. As a part of the company’s commitment to the environment and for the fulfillment of the mandatory HSSEMS procedures, all the remediation contractors need to report waste generation data from the various project activities on a monthly basis. Data on waste generation is collected in order to implement cost-efficient and sustainable waste management operations. Data analytics approaches can be built on top of the data to produce more detailed and timely waste generation information as the basis of waste management and collection. The results obtained highlight the potential of advanced data analytics approaches in producing more detailed waste generation information for the planning and optimization of waste collection and recycling.
Keywords: waste, technologies, KERP, data, soil
Procedia PDF Downloads 113
2239 Finite Element Modelling and Optimization of Post-Machining Distortion for Large Aerospace Monolithic Components
Authors: Bin Shi, Mouhab Meshreki, Grégoire Bazin, Helmi Attia
Abstract:
Large monolithic components are widely used in the aerospace industry in order to reduce airplane weight. Milling is an important operation in the manufacturing of monolithic parts. More than 90% of the material may be removed in the milling operation to obtain the final shape. This results in low rigidity and post-machining distortion. The post-machining distortion is the deviation of the final shape from the original design after releasing the clamps. It is a major challenge in the machining of monolithic parts, causing billions in economic losses every year. Three sources are directly related to the part distortion: initial residual stresses (RS) generated from previous manufacturing processes, machining-induced RS, and the thermal load generated during machining. A finite element model was developed to simulate a milling process and predict the post-machining distortion. In this study, a rolled-aluminum plate AA7175 with a thickness of 60 mm was used for the raw block. The initial residual stress distribution in the block was measured using a layer-removal method. A stress-mapping technique was developed to implement the initial stress distribution into the part. It is demonstrated that this technique significantly accelerates the simulation time. Machining-induced residual stresses on the machined surface were measured using the MTS3000 hole-drilling strain-gauge system. The measured RS was applied on the machined surface of a plate to predict the distortion. The predicted distortion was compared with experimental results. It is found that the effect of the machining-induced residual stress on the distortion of a thick plate is very limited. The distortion can be ignored if the wall thickness is larger than a certain value. The RS generated from the thermal load during machining is another important factor causing part distortion. Very little research on this topic has been reported in the literature. A coupled thermo-mechanical FE model was developed to evaluate the thermal effect on the plastic deformation of a plate. A moving heat source with a feed rate was used to simulate the dynamic cutting heat in a milling process. When the heat source passed over the part surface, a small layer was removed to simulate the cutting operation. The results show that for different feed rates and plate thicknesses, plastic deformation/distortion occurs only if the temperature exceeds a critical level. It was found that the initial residual stress has a major contribution to the part distortion. The machining-induced stress has limited influence on the distortion of thin-wall structures when the wall thickness is larger than a certain value. The thermal load can also generate part distortion when the cutting temperature is above a critical level. The developed numerical model was employed to predict the distortion of a frame part with complex structures. The predictions were compared with the experimental measurements, showing that both are in good agreement. Through optimization of the position of the part inside the raw plate using the developed numerical models, the part distortion can be significantly reduced by 50%.
Keywords: modelling, monolithic parts, optimization, post-machining distortion, residual stresses
Procedia PDF Downloads 54
2238 On the Design of Wearable Fractal Antenna
Authors: Amar Partap Singh Pharwaha, Shweta Rani
Abstract:
This paper proposes a rhombus-shaped wearable fractal antenna for wireless communication systems. The geometrical descriptors of the antenna have been obtained using bacterial foraging optimization (BFO) for wideband operation. The method-of-moments-based IE3D software has been used to simulate the antenna, and it is observed that a miniaturization of 13.08% has been achieved without degrading the resonating properties of the proposed antenna. An analysis with different substrates has also been carried out in order to evaluate the effect of electrical permittivity on the presented structure. The proposed antenna has a low profile and light weight, and has successfully demonstrated wideband and multiband characteristics for wearable electronic applications.
Keywords: BFO, bandwidth, electrical permittivity, fractals, wearable antenna
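A greatly simplified, chemotaxis-only sketch of bacterial foraging optimization (BFO), the algorithm used to tune the antenna's geometrical descriptors; the reproduction and elimination-dispersal phases of full BFO are omitted, and the cost function is a hypothetical stand-in for a simulated antenna score.

```python
import numpy as np

def cost(x):
    # Hypothetical design objective; in the paper this would score the
    # simulated antenna (e.g., bandwidth and return loss from IE3D runs).
    return np.sum((x - 0.3) ** 2)

rng = np.random.default_rng(3)
dim, n_bact, steps, step_size = 4, 20, 100, 0.05
swarm = rng.uniform(0, 1, size=(n_bact, dim))   # normalized design variables

for _ in range(steps):
    for i in range(n_bact):
        j = cost(swarm[i])
        d = rng.normal(size=dim)
        d /= np.linalg.norm(d)               # random tumble direction
        trial = swarm[i] + step_size * d     # swim one step
        if cost(trial) < j:                  # keep the step if it improves
            swarm[i] = trial

best = swarm[np.argmin([cost(b) for b in swarm])]
print("best design variables:", best)
```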
Procedia PDF Downloads 463
2237 GGE-Biplot Analysis of Nano-Titanium Dioxide and Nano-Silica Effects on Sunflower
Authors: Naser Sabaghnia, Mohsen Janmohammadi, Mehdi Mohebodini
Abstract:
The present investigation was performed to evaluate the effects of foliar application of salicylic acid, glycine betaine, ascorbic acid, nano-silica, and nano-titanium dioxide on sunflower. Results showed that the first two principal components were sufficient to create a two-dimensional treatment-by-trait biplot, and this biplot accounted for 49% and 19%, respectively, of the interaction between traits and treatments. The vertex treatments of the polygon were ascorbic acid, glycine betaine, nano-TiO2, and control, indicating high performance in some important traits, consisting of the number of days to seed maturity, number of seeds per head, number of heads per single plant, hundred-seed weight, seed length, seed yield performance, and oil content. Treatments suitable for obtaining high seed yield were identified in the vector-view function of the biplot, which displayed nano-silica and nano-titanium dioxide as the best treatments for obtaining high seed yield.
Keywords: drought stress, nano-silicon dioxide, oil content, TiO2 nanoparticles
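A minimal sketch of the treatment-by-trait biplot computation: center the treatment x trait matrix, take the first two principal components via SVD, and report the share of interaction each accounts for. The trait values are invented placeholders; the article's matrix holds the measured sunflower traits per treatment.

```python
import numpy as np

treatments = ["control", "salicylic acid", "glycine betaine",
              "ascorbic acid", "nano-silica", "nano-TiO2"]
rng = np.random.default_rng(5)
M = rng.normal(size=(6, 8))                  # placeholder trait measurements

Mc = M - M.mean(axis=0)                      # column-center before SVD
U, s, Vt = np.linalg.svd(Mc, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("PC1, PC2 share:", explained[:2])      # abstract reports 49% and 19%

scores = U[:, :2] * s[:2]                    # treatment coordinates
loadings = Vt[:2].T                          # trait vectors for the biplot
for t, xy in zip(treatments, scores):
    print(f"{t:15s} -> ({xy[0]:+.2f}, {xy[1]:+.2f})")
```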
Procedia PDF Downloads 338
2236 Numerical Study for Spatial Optimization of DVG for Fin and Tube Heat Exchangers
Authors: Amit Arora, P. M. V. Subbarao, R. S. Agarwal
Abstract:
This study attempts to find promising locations of upwash delta winglets for an inline finned tube heat exchanger. Subsequently, the location of winglets that delivers the highest improvement in thermal performance is identified. Numerical results clearly showed that optimally located upwash delta winglets not only improved the thermal performance of the fin area in the tube wake and of the tubes, but also improved the overall thermal performance of the heat exchanger.
Keywords: apparent friction factor, delta winglet, fin and tube heat exchanger, longitudinal vortices
Procedia PDF Downloads 310
2235 Forecasting the Future Implications of ChatGPT Usage in Education Based on AI Algorithms
Authors: Yakubu Bala Mohammed, Nadire Chavus, Mohammed Bulama
Abstract:
The Chat Generative Pre-trained Transformer (ChatGPT) represents an artificial intelligence (AI) tool capable of swiftly generating comprehensive responses to prompts and follow-up inquiries. This emerging AI tool was introduced in November 2022 by OpenAI, an American AI research laboratory, utilizing large language models. The present study aims to delve into the potential future consequences of ChatGPT usage in education using AI-based algorithms. The paper will bring forth the likely potential risks of ChatGPT utilization, such as academic integrity concerns, unfair learning assessments, excessive reliance on AI, and dissemination of inaccurate information. Four machine learning algorithms, chosen for their robustness, will be used to analyze the collected data: eXtreme Gradient Boosting (XGBoost), Support Vector Machine (SVM), Emotional Artificial Neural Network (EANN), and Random Forest (RF). Finally, the findings of the study will assist education stakeholders in understanding the future implications of ChatGPT usage in education and propose solutions and directions for upcoming studies.
Keywords: machine learning, ChatGPT, education, learning, implications
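A hedged sketch of the planned model comparison. XGBoost, SVM, and Random Forest use standard libraries; the Emotional ANN (EANN) has no common off-the-shelf implementation, so a plain MLP stands in for it here, and the features and labels are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from xgboost import XGBClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 10))               # placeholder survey features
y = rng.integers(0, 2, size=300)             # placeholder risk labels

models = {
    "XGBoost": XGBClassifier(n_estimators=200, max_depth=3),
    "SVM": SVC(kernel="rbf"),
    "EANN (MLP stand-in)": MLPClassifier(hidden_layer_sizes=(32, 16),
                                         max_iter=1000),
    "Random Forest": RandomForestClassifier(n_estimators=200),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:22s} CV accuracy: {scores.mean():.3f}")
```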
Procedia PDF Downloads 232
2234 Modified Form of Margin Based Angular Softmax Loss for Speaker Verification
Authors: Jamshaid ul Rahman, Akhter Ali, Adnan Manzoor
Abstract:
Learning-based systems have received increasing interest in recent years; recognition structures, including end-to-end speaker recognition, are one of the hot topics in this area. A famous work on end-to-end speaker verification using the Angular Softmax Loss gained significant importance and is considered useful for directly training a discriminative model instead of the traditionally adopted i-vector approach. The margin-based strategy in angular softmax is beneficial for learning discriminative speaker embeddings, but the random selection of margin values is a big issue in both the additive angular margin and the multiplicative angular margin. As a better solution to this matter, we present an alternative approach by introducing a similar form of an additive parameter that was originally introduced for face recognition; it has the capacity to adjust automatically to the corresponding margin values and is applicable to learning more discriminative features than the softmax. Experiments are conducted on part of the Fisher dataset, where it is observed that the additive parameter with angular softmax to train the front-end, together with probabilistic linear discriminant analysis (PLDA) in the back-end, boosts the performance of the structure.
Keywords: additive parameter, angular softmax, speaker verification, PLDA
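A sketch of an additive angular-margin softmax in the spirit the abstract describes (the additive parameter originates in face recognition, as in ArcFace). This is not the authors' exact formulation; the scale s and margin m are assumed hyperparameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAngularMarginSoftmax(nn.Module):
    def __init__(self, emb_dim, n_speakers, s=30.0, m=0.2):
        super().__init__()
        self.W = nn.Parameter(torch.randn(n_speakers, emb_dim))
        self.s, self.m = s, m

    def forward(self, emb, labels):
        # cos(theta) between L2-normalized embeddings and class weights
        cosine = F.linear(F.normalize(emb), F.normalize(self.W))
        theta = torch.acos(cosine.clamp(-1 + 1e-7, 1 - 1e-7))
        # add the margin only to the target-speaker angle
        one_hot = F.one_hot(labels, cosine.size(1)).float()
        logits = self.s * torch.cos(theta + self.m * one_hot)
        return F.cross_entropy(logits, labels)

loss_fn = AdditiveAngularMarginSoftmax(emb_dim=192, n_speakers=100)
emb = torch.randn(8, 192)                    # placeholder embedding batch
labels = torch.randint(0, 100, (8,))
print(loss_fn(emb, labels))
```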
Procedia PDF Downloads 103
2233 Durability Analysis of a Knuckle Arm Using VPG System
Authors: Geun-Yeon Kim, S. P. Praveen Kumar, Kwon-Hee Lee
Abstract:
A steering knuckle arm is the component that connects the steering system and the suspension system. Structural performances such as stiffness, strength, and durability are considered in its design process. A former study suggested the lightweight design of a knuckle arm considering the structural performances and using metamodel-based optimization. Six shape design variables were defined, and the optimum design was calculated by applying the kriging interpolation method. The finite element method was utilized to predict the structural responses. The suggested knuckle was made of the aluminum alloy Al6082, and its weight was reduced by about 60% in comparison with the base steel knuckle, satisfying the design requirements. Then, we investigated its manufacturability by performing forging analysis. The forging was done as a hot process, and the product was made through two-step forging. As a final step of its development process, the durability is investigated by using the flexible dynamic analysis software LS-DYNA and the pre- and post-processor eta/VPG. Generally, a carmaker does not share all the information with the part manufacturer; thus, the part manufacturer is limited in predicting the durability performance at the full-car level. eta/VPG has libraries of suspensions, tires, and roads, which are commonly used parts, and this makes full-car modeling possible. First, the full car is modeled by referencing the following information: Overall Length: 3,595 mm, Overall Width: 1,595 mm, CVW (Curb Vehicle Weight): 910 kg, Front Suspension: MacPherson Strut, Rear Suspension: Torsion Beam Axle, Tire: 235/65R17. Second, the road is selected as cobblestone; the road condition of the cobblestone is almost 10 times more severe than that of a usual paved road. Third, dynamic finite element analysis using LS-DYNA is performed to predict the durability performance of the suggested knuckle arm. The life of the suggested knuckle arm is calculated as 350,000 km, which satisfies the design requirement set by the part manufacturer. In this study, the overall design process of a knuckle arm is suggested, and it can be seen that the developed knuckle arm satisfies the durability design requirement at the full-car level. The VPG analysis is performed successfully even though it does not give an exact prediction, since the full-car model is a very rough one. Thus, this approach can be used effectively when the details of the full car are not given.
Keywords: knuckle arm, structural optimization, metamodel, forging, durability, VPG (Virtual Proving Ground)
Procedia PDF Downloads 419
2232 Early Outcomes and Lessons from the Implementation of a Geriatric Hip Fracture Protocol at a Level 1 Trauma Center
Authors: Peter Park, Alfonso Ayala, Douglas Saeks, Jordan Miller, Carmen Flores, Karen Nelson
Abstract:
Introduction: Hip fractures account for more than 300,000 hospital admissions every year. Many present as fragility fractures in geriatric patients with multiple medical comorbidities. Standardized protocols for the multidisciplinary management of this patient population have been shown to improve patient outcomes. A hip fracture protocol was implemented at a Level I trauma center with a focus on pre-operative medical optimization and early surgical care. This study evaluates the efficacy of that protocol, including the early transition period. Methods: A retrospective review was performed of all patients aged 60 and older with isolated hip fractures who were managed surgically between 2020 and 2022. This included patients 1 year prior to and 1 year following the implementation of a hip fracture protocol at a Level I trauma center. Results: 530 patients were identified: 249 patients were treated before, and 281 patients were treated after the protocol was instituted. There was no difference in mean age (p=0.35), gender (p=0.3), or Charlson Comorbidity Index (p=0.38) between the cohorts. Following the implementation of the protocol, there were observed increases in time to surgery (27.5h vs. 33.8h, p=0.01), hospital length of stay (6.3d vs. 9.7d, p<0.001), and emergency department (ED) length of stay (5.1h vs. 6.2h, p<0.001). There were no differences in in-hospital mortality (2.01% pre vs. 3.20% post, p=0.39) or complication rates (25% pre vs. 26% post, p=0.76). A trend towards improved outcomes was seen after the early transition period but failed to reach statistical significance. Conclusion: Early medical management and surgical intervention are key determining factors affecting outcomes following fragility hip fractures. The implementation of a hip fracture protocol at this institution has not yet significantly affected these parameters. This could in part be due to the restrictions placed on this institution during the COVID-19 pandemic. Despite this, the time to OR pre- and post-implementation was quicker than figures reported elsewhere in the literature. Further longitudinal data will be collected to determine the final influence of this protocol. Significance/Clinical Relevance: Given the increasing number of elderly people and the high morbidity and mortality associated with hip fractures in this population, finding cost-effective ways to improve outcomes in the management of these injuries has the potential to have an enormous positive impact for both patients and hospital systems.
Keywords: hip fracture, geriatric, treatment algorithm, preoperative optimization
Procedia PDF Downloads 79
2231 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems
Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan
Abstract:
Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in terms of data reading speed performance. As we ascend in the hierarchy, data reading speed becomes faster. Thus, migrating the application’s important data that will be accessed in the near future to the uppermost level will reduce the application’s I/O waiting time and, hence, its execution elapsed time. In this research, we implement a trace-driven two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application’s data in order to determine its near-future data accesses, in parallel with its on-demand requests. The important data (i.e., the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application execution elapsed time by at least 22% across a variety of traces.
Keywords: hybrid storage system, data mining, recurrent neural network, support vector machine
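A toy sketch of the classification-plus-migration idea: a classifier trained on past access traces predicts which blocks the application will touch soon, and predicted-hot blocks are migrated to the fast (SSD) tier ahead of the on-demand request. The features, labels, and tier bookkeeping are assumptions, not the paper's prototype; an SVM stands in for the classifiers named in the keywords.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(9)
# Per-block features from the trace: recency, frequency, stride regularity.
X = rng.normal(size=(500, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # 1 = accessed in the near future

clf = SVC().fit(X[:400], y[:400])            # train on the historical trace
hot = clf.predict(X[400:])                   # classify the live blocks

ssd, hdd = set(), set(range(400, 500))
for block, is_hot in zip(range(400, 500), hot):
    if is_hot:                               # migrate upward in the hierarchy
        hdd.discard(block)
        ssd.add(block)
print(f"migrated {len(ssd)} of 100 blocks to the SSD tier")
```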
Procedia PDF Downloads 308
2230 Biosynthesis of L-Xylose from Xylitol Using a Dual Enzyme Cascade in Escherichia coli
Authors: Mesfin Angaw Tesfay
Abstract:
L-xylose is an important intermediate in the pharmaceutical industry, playing a key role in the production of various antiviral and anticancer drugs. Despite its significance, L-xylose is a rare and costly sugar with limited availability in nature. In recent years, enzymatic production methods have garnered considerable attention due to their benefits over conventional chemical synthesis. In this research, a dual enzyme cascade system was developed to synthesize L-xylose from an inexpensive substrate, xylitol. The study involved cloning and co-expressing two key genes: the L-fucose isomerase (L-fucI) gene from Escherichia coli K-12 and the xylitol-4-dehydrogenase (xdh) gene from Pantoea ananatis ATCC 43072 in Escherichia coli. The resulting recombinant cells, engineered with the PET28a-xdh/L-fucI vector, were able to effectively convert xylitol to L-xylose. The system showed optimal performance at 40°C and a pH of 10.0. Moreover, Zn²⁺ (7.5 mM) enhanced the catalytic activity by 1.34 times. This approach yielded 52.2 g/L of L-xylose from an initial 80 g/L xylitol concentration, with a 65% conversion efficiency and a productivity rate of 1.86. The study highlights a practical method for producing L-xylose from xylitol through a co-expression system carrying the L-fucI and xdh genes.
Keywords: l-fucose isomerase, xylitol-4-dehydrogenase, l-xylose, xylitol, co-expression
Procedia PDF Downloads 26
2229 Pilot Scale Production and Compatibility Criteria of New Self-Cleaning Materials
Authors: Jonjaua Ranogajec, Ognjen Rudic, Snezana Pasalic, Snezana Vucetic, Damir Cjepa
Abstract:
The paper involves a chain of activities from synthesis, establishment of the methodology for characterization and testing of novel protective materials through the pilot production and application on model supports. It summarizes the results regarding the development of the pilot production protocol for newly developed self-cleaning materials. The optimization of the production parameters was completed in order to improve the most important functional properties (mineralogy characteristics, particle size, self-cleaning properties and photocatalytic activity) of the newly designed nanocomposite material.
Keywords: pilot production, self-cleaning materials, compatibility, cultural heritage
Procedia PDF Downloads 395
2228 Vibration-Based Data-Driven Model for Road Health Monitoring
Authors: Guru Prakash, Revanth Dugalam
Abstract:
A road’s condition often deteriorates due to harsh loading, such as overloaded trucks, and severe environmental conditions, such as heavy rain, snow load, and cyclic loading. In the absence of proper maintenance planning, this results in potholes, wide cracks, bumps, and increased roughness of roads. In this paper, a data-driven model will be developed to detect these damages using vibration and image signals. The key idea of the proposed methodology is that the road anomaly manifests itself in these signals, which can be detected by training a machine learning algorithm. The use of various machine learning techniques, such as the support vector machine and the Random Forest method, will be investigated. The proposed model will first be trained and tested with artificially simulated data, and the model architecture will be finalized by comparing the accuracies of the various models. Once a model is fixed, the field study will be performed and data will be collected. The field data will be used to validate the proposed model and to predict the road’s future health condition. The proposed model will help to automate the road condition monitoring process, repair cost estimation, and maintenance planning.
Keywords: SVM, data-driven, road health monitoring, pot-hole
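A short sketch of the vibration-side pipeline: summary features computed over fixed windows of an accelerometer signal feed the SVM and Random Forest classifiers the abstract names. The signal, the injected anomaly, and the feature set are simulated assumptions.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(4)
windows, labels = [], []
for _ in range(200):
    bumpy = int(rng.integers(0, 2))          # 1 = window contains an anomaly
    sig = rng.normal(scale=1.0, size=512)
    if bumpy:
        sig[256:266] += 6.0                  # injected pothole-like spike
    windows.append([sig.std(), np.abs(sig).max(), kurtosis(sig)])
    labels.append(bumpy)

X, y = np.array(windows), np.array(labels)
for name, clf in [("SVM", SVC()), ("Random Forest", RandomForestClassifier())]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```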
Procedia PDF Downloads 86
2227 Statistical Analysis of the Impact of Maritime Transport Gross Domestic Product (GDP) on Nigeria’s Economy
Authors: Kehinde Peter Oyeduntan, Kayode Oshinubi
Abstract:
Nigeria is referred to as the ‘Giant of Africa’ due to its high population, large land mass, and large economy. However, it still trails far behind many smaller economies on the continent in terms of maritime operations. The maritime industry is the spark plug for national growth, because it houses the most crucial infrastructure that generates wealth for a nation; it is therefore worrisome that a nation with six seaports lags in maritime activities. In this research, we have studied how the Gross Domestic Product (GDP) of maritime transport influences the Nigerian economy. To do this, we applied Simple Linear Regression (SLR), Support Vector Machine (SVM), Polynomial Regression Model (PRM), Generalized Additive Model (GAM) and Generalized Linear Mixed Model (GLMM) to model the relationship between the nation’s Total GDP (TGDP) and the Maritime Transport GDP (MGDP) using a time series of 20 years of data. The result showed that the MGDP is statistically significant to the Nigerian economy. Among the statistical tools applied, the PRM of order 4 describes the relationship best when compared to the other methods. The recommendations presented in this study will guide policy makers and help improve the economy of Nigeria in terms of its GDP.
Keywords: maritime transport, economy, GDP, regression, port
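A minimal sketch of the best-performing model in the study: a 4th-order polynomial regression of total GDP on maritime-transport GDP. The numbers below are illustrative placeholders, not the 20-year Nigerian series.

```python
import numpy as np

mgdp = np.linspace(1.0, 20.0, 20)            # maritime transport GDP (fake)
tgdp = (5 + 3 * mgdp + 0.05 * mgdp**2
        + np.random.default_rng(8).normal(size=20))

coeffs = np.polyfit(mgdp, tgdp, deg=4)       # PRM of order 4
model = np.poly1d(coeffs)

pred = model(mgdp)
ss_res = np.sum((tgdp - pred)**2)
ss_tot = np.sum((tgdp - tgdp.mean())**2)
print("R^2 =", 1 - ss_res / ss_tot)          # goodness of fit
print("forecast at MGDP=22:", model(22.0))
```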
Procedia PDF Downloads 154
2226 Forensic Detection of Errors Permitted by the Witnesses in Their Testimony
Authors: Lev Bertovsky
Abstract:
The purpose of this study was to determine the reasons for the formation of false testimony from witnesses and to make recommendations on the recognition of such cases. During the studies, which were based on the achievements of professionals in the field of psychology as well as on personal investigative practice, the stages of perception of information were studied, along with the process of its recall from memory and its transmission to the communicator upon request. Based on the principles of the human brain, the kinds of honest witness mistakes were systematized. Proposals were formulated for the optimization of investigative actions in cases where witnesses make an honest mistake with respect to the events previously observed by them.
Keywords: criminology, eyewitness testimony, honest mistake, information, investigator, investigation, questioning
Procedia PDF Downloads 185
2225 Isolation and Classification of Red Blood Cells in Anemic Microscopic Images
Authors: Jameela Ali Alkrimi, Abdul Rahim Ahmad, Azizah Suliman, Loay E. George
Abstract:
Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. The lack of RBCs is a condition characterized by a lower than normal hemoglobin level; this condition is referred to as 'anemia'. In this study, software was developed to isolate RBCs by using a machine learning approach to classify anemic RBCs in microscopic images. Several features of RBCs were extracted using image processing algorithms, including principal component analysis (PCA). With the proposed method, RBCs were isolated in 34 seconds from an image containing 18 to 27 cells. We also proposed that PCA could be performed to increase the speed and efficiency of classification. Our classifier algorithm yielded accuracy rates of 100%, 99.99%, and 96.50% for the K-nearest neighbor (K-NN) algorithm, the support vector machine (SVM), and the artificial neural network (ANN), respectively. Classification was evaluated in terms of sensitivity, specificity, and kappa statistical parameters. In conclusion, the classification results were obtained in a shorter time and more efficiently when PCA was used.
Keywords: red blood cells, pre-processing image algorithms, classification algorithms, principal component analysis PCA, confusion matrix, kappa statistical parameters, ROC
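A condensed sketch of the reported pipeline: cell features are reduced with PCA, then classified with K-NN, SVM, and an ANN. The feature matrix is simulated; real features would come from the image processing stage the abstract describes.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(6)
X = rng.normal(size=(300, 40))               # placeholder cell features
y = rng.integers(0, 2, size=300)             # 1 = anemic cell (placeholder)

for name, clf in [("K-NN", KNeighborsClassifier(3)),
                  ("SVM", SVC()),
                  ("ANN", MLPClassifier(max_iter=1000))]:
    # PCA shrinks the feature space, which is the reported speed-up.
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    print(name, cross_val_score(pipe, X, y, cv=5).mean())
```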
Procedia PDF Downloads 405
2224 An Integrated Approach to the Carbonate Reservoir Modeling: Case Study of the Eastern Siberia Field
Authors: Yana Snegireva
Abstract:
Carbonate reservoirs are known for their heterogeneity, resulting from various geological processes such as diagenesis and fracturing. These complexities may cause great challenges in understanding fluid flow behavior and predicting the production performance of naturally fractured reservoirs. The investigation of carbonate reservoirs is crucial, as many petroleum reservoirs are naturally fractured, which can be difficult to study due to the complexity of their fracture networks. This can lead to geological uncertainties, which are important for global petroleum reserves. The problem statement outlines the key challenges in carbonate reservoir modeling, including the accurate representation of fractures and their connectivity, as well as capturing the impact of fractures on fluid flow and production. Traditional reservoir modeling techniques often oversimplify fracture networks, leading to inaccurate predictions. Therefore, there is a need for a modern approach that can capture the complexities of carbonate reservoirs and provide reliable predictions for effective reservoir management and production optimization. The modern approach to carbonate reservoir modeling involves the utilization of the hybrid fracture modeling approach, including the discrete fracture network (DFN) method and the implicit fracture network, which offer enhanced accuracy and reliability in characterizing the complex fracture systems within these reservoirs. This study focuses on the application of the hybrid method in the Nepsko-Botuobinskaya anticline of the Eastern Siberia field, aiming to prove the appropriateness of this method under these geological conditions. The DFN method is adopted to model the fracture network within the carbonate reservoir. This method considers fractures as discrete entities, capturing their geometry, orientation, and connectivity. The method has a significant disadvantage, however, since the number of fractures in the field can be very high: due to limitations in the amount of main memory, it is very difficult to represent these fractures explicitly. By integrating data from image logs (formation micro-imager), core data, and fracture density logs, a discrete fracture network (DFN) model can be constructed to represent the fracture characteristics of hydraulically relevant fractures. The results obtained from the DFN modeling approaches provide valuable insights into the carbonate reservoir behavior of the Eastern Siberia field. The DFN model accurately captures the fracture system, allowing for a better understanding of fluid flow pathways, connectivity, and potential production zones. The analysis of simulation results enables the identification of zones of increased fracturing and optimization opportunities for reservoir development, with the potential application of enhanced oil recovery techniques, which were considered in further simulations on the dual-porosity and dual-permeability models. This approach considers fractures as separate, interconnected flow paths within the reservoir matrix, allowing for the characterization of dual-porosity media. The case study of the Eastern Siberia field demonstrates the effectiveness of the hybrid model method in accurately representing fracture systems and predicting reservoir behavior. The findings from this study contribute to improved reservoir management and production optimization in carbonate reservoirs with the use of enhanced and improved oil recovery methods.
Keywords: carbonate reservoir, discrete fracture network, fracture modeling, dual porosity, enhanced oil recovery, implicit fracture model, hybrid fracture model
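A toy sketch of discrete fracture network (DFN) generation in 2D: fracture centers, orientations, and power-law lengths are sampled, which is the usual starting point of a DFN workflow like the one described. The statistical parameters are assumptions, not field-calibrated values.

```python
import numpy as np

rng = np.random.default_rng(11)
n_frac = 200
centers = rng.uniform(0, 1000, size=(n_frac, 2))      # m, model extent
strikes = rng.vonmises(mu=np.deg2rad(45), kappa=4, size=n_frac)
# Power-law (Pareto) length distribution, truncated at l_min = 10 m.
lengths = 10.0 * (1 - rng.uniform(size=n_frac)) ** (-1.0 / 1.8)

segments = []
for (cx, cy), a, l in zip(centers, strikes, lengths):
    dx, dy = 0.5 * l * np.cos(a), 0.5 * l * np.sin(a)
    segments.append(((cx - dx, cy - dy), (cx + dx, cy + dy)))

# Fracture intensity P21 (total length per area) is a common QC check
# against image-log fracture density before flow simulation.
p21 = lengths.sum() / (1000.0 * 1000.0)
print(f"generated {n_frac} fractures, P21 = {p21:.4f} 1/m")
```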
Procedia PDF Downloads 75
2223 Assessing the Role of Human Mobility on Malaria Transmission in South Sudan
Authors: A. Y. Mukhtar, J. B. Munyakazi, R. Ouifki
Abstract:
Over the past few decades, the unprecedented increase in mobility has raised considerable concern about the relationship between mobility and vector-borne diseases, and malaria in particular. Thus, one can claim that human mobility is one of the contributing factors to the resurgence of malaria. To assess the effect of human mobility on the malaria burden among hosts, we formulate a movement-based model on a network of patches. We then extend the human multi-group SEIAR deterministic epidemic model into a system of stochastic differential equations (SDEs). Our quantitative stochastic model, which is expressed in terms of average rates of movement between compartments, is fitted to time-series data (weekly malaria data of 2011 for each patch) using the maximum likelihood approach. Using the metapopulation (multi-group) model, we compute and analyze the basic reproduction number. The result shows that human movement is sufficient to sustain malaria persistence in the patches with low transmission. From these results, we conclude that the sensitivity of malaria to human mobility has greatly important implications for future malaria control in South Sudan.
Keywords: basic reproduction number, malaria, maximum likelihood, movement, stochastic model
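An illustrative Euler-Maruyama sketch for a single patch: a deterministic infection term perturbed by a Brownian noise term, the kind of SDE the abstract extends the multi-group model into. The reduced (S, I) state, the parameter values, and the noise form are assumptions for brevity, not the paper's SEIAR system.

```python
import numpy as np

rng = np.random.default_rng(12)
beta, gamma, sigma = 0.3, 0.1, 0.05          # transmission, recovery, noise
S, I = 0.99, 0.01                            # susceptible/infectious fractions
dt, T = 0.1, 520.0                           # weekly-scale time grid

t = 0.0
while t < T:
    new_inf = beta * S * I
    dW = rng.normal(scale=np.sqrt(dt))       # Brownian increment
    S += -new_inf * dt - sigma * S * I * dW
    I += (new_inf - gamma * I) * dt + sigma * S * I * dW
    S, I = max(S, 0.0), max(I, 0.0)          # keep fractions non-negative
    t += dt

print(f"infectious fraction after {T:.0f} time units: {I:.4f}")
```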
Procedia PDF Downloads 134
2222 The Survey of Phlebotomine Sandfly (Diptera: Psychodidae) of Al-Asaba Area in the Northwest Region of Libya
Authors: Asherf El-Abaied, Elsadik Anan, Badereddin Annajar, Mustafa Saieh, Abudalnaser El-Buni
Abstract:
Zoonotic Cutaneous Leishmaniasis (ZCL) has been endemic in the northwestern region of Libya for over nine decades. A survey of the sandfly fauna in the region revealed that 13 species have been recorded, with various distribution and abundance patterns. Phlebotomus papatasi proved to be the main vector of the disease in many areas. The aims of this study were to identify the sandfly species present in Al-Asaba town and to determine their spatial and seasonal abundance. An epidemiological analysis of the data obtained from the recorded cases was also carried out. Sand flies were collected from various sites using sticky traps and CDC miniature light traps during the period from March to November 2006. Recorded ZCL cases were collected from the local Primary Health Care Department and analysed using the SPSS statistical package. Ten species of sandflies were identified, seven belonging to the genus Phlebotomus and three belonging to the genus Sergentomyia. P. papatasi was the most abundant species, with the peak season recorded in September. The prevalence of the disease was low; however, a notable increase in ZCL cases over the last three years was indicated.
Keywords: Cutaneous leishmaniasis, Phlebotomus papatasi, sandfly fauna, Libya
Procedia PDF Downloads 302
2221 From Theory to Practice: Harnessing Mathematical and Statistical Sciences in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in diverse domains has created an urgent need for effective utilization of mathematical and statistical sciences in data analytics. This abstract explores the journey from theory to practice, emphasizing the importance of harnessing mathematical and statistical innovations to unlock the full potential of data analytics. Drawing on a comprehensive review of existing literature and research, this study investigates the fundamental theories and principles underpinning mathematical and statistical sciences in the context of data analytics. It delves into key mathematical concepts such as optimization, probability theory, statistical modeling, and machine learning algorithms, highlighting their significance in analyzing and extracting insights from complex datasets. Moreover, this abstract sheds light on the practical applications of mathematical and statistical sciences in real-world data analytics scenarios. Through case studies and examples, it showcases how mathematical and statistical innovations are being applied to tackle challenges in various fields such as finance, healthcare, marketing, and social sciences. These applications demonstrate the transformative power of mathematical and statistical sciences in data-driven decision-making. The abstract also emphasizes the importance of interdisciplinary collaboration, as it recognizes the synergy between mathematical and statistical sciences and other domains such as computer science, information technology, and domain-specific knowledge. Collaborative efforts enable the development of innovative methodologies and tools that bridge the gap between theory and practice, ultimately enhancing the effectiveness of data analytics. Furthermore, ethical considerations surrounding data analytics, including privacy, bias, and fairness, are addressed within the abstract. It underscores the need for responsible and transparent practices in data analytics, and highlights the role of mathematical and statistical sciences in ensuring ethical data handling and analysis. In conclusion, this abstract highlights the journey from theory to practice in harnessing mathematical and statistical sciences in data analytics. It showcases the practical applications of these sciences, the importance of interdisciplinary collaboration, and the need for ethical considerations. By bridging the gap between theory and practice, mathematical and statistical sciences contribute to unlocking the full potential of data analytics, empowering organizations and decision-makers with valuable insights for informed decision-making.
Keywords: data analytics, mathematical sciences, optimization, machine learning, interdisciplinary collaboration, practical applications
Procedia PDF Downloads 93
2220 Social Media Mining with R. Twitter Analyses
Authors: Diana Codat
Abstract:
The analysis of tweets is part of text mining. Each document is a written text, so it is possible to apply the usual text search techniques, in particular by switching to the bag-of-words representation. But tweets have peculiarities. Some may enrich the analysis: their length is calibrated (at least as far as public messages are concerned), special characters make it possible to identify authors (@) and themes (#), and the tweet and retweet mechanisms make it possible to follow the diffusion of information. Conversely, other characteristics may disrupt the analyses. Because space is limited, authors often use abbreviations and emoticons to express feelings, and they do not pay much attention to spelling. All this creates noise that can complicate the task. Tweets carry a lot of potentially interesting information, and their exploitation is one of the main axes of the analysis of social networks. We show how to access Twitter-related messages. We initiate a study of the properties of tweets, and we follow up with the exploitation of the content of the messages. We work under R with the package 'twitteR'. The study of tweets is a strong focus of social network analysis because Twitter has become an important vector of communication. This example shows that it is easy to initiate an analysis from data extracted directly online. The data preparation phase is of great importance.
Keywords: data mining, language R, social networks, Twitter
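The article works in R with the 'twitteR' package; to keep a single language across this document's examples, here is a rough Python analogue of the tweet-specific preprocessing it discusses: pulling out authors (@), themes (#), and retweet markers before a bag-of-words pass. The sample tweets are invented.

```python
import re
from collections import Counter

tweets = [
    "RT @alice: text #mining with #rstats is fun :)",
    "@bob check this #rstats thread!!",
]

mentions, hashtags, words = Counter(), Counter(), Counter()
for t in tweets:
    mentions.update(re.findall(r"@\w+", t))      # authors
    hashtags.update(re.findall(r"#\w+", t))      # themes
    # Strip retweet markers, handles, hashtags, and punctuation noise.
    cleaned = re.sub(r"(RT\s+)?[@#]\w+|[^\w\s]", " ", t).lower()
    words.update(cleaned.split())                # bag-of-words tokens

print("authors:", mentions.most_common())
print("themes:", hashtags.most_common())
print("tokens:", words.most_common(5))
```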
Procedia PDF Downloads 184
2219 Use the Null Space to Create Starting Point for Stochastic Programming
Authors: Ghussoun Al-Jeiroudi
Abstract:
Stochastic programming is one of the powerful techniques used to solve real-life problems, since the data of real-life problems are subject to significant uncertainty. Uncertainty is well studied and modeled by stochastic programming. Each day, problems become bigger and bigger, and the need for tools that can deal with large-scale problems increases. The interior point method is a perfect tool to solve such problems, and it is widely employed to solve the programs which arise from stochastic programming. It is an iterative technique, so it requires a starting point. A well-designed starting point plays an important role in improving the convergence speed. In this paper, we propose a starting point for the interior point method for multistage stochastic programming. Usually, the optimal solution of stage k+1 is used as the starting point for stage k. This point has the advantage of being close to the solution of the current program. However, it has a disadvantage: it is not in the feasible region of the current program. So, we suggest taking this point and modifying it, namely by adding to it a vector in the null space of the matrix of the unchanged constraints, because the solution will change only in the null space of this matrix.
Keywords: interior point methods, stochastic programming, null space, starting points
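A small sketch of the proposed construction: take the optimal point of stage k+1 and add a null-space correction, so that the unchanged constraints A x = b stay satisfied exactly. The matrices, the previous-stage point, and the choice of correction step are toy stand-ins.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 1.0, 0.0, 0.0],          # unchanged constraints
              [0.0, 0.0, 1.0, 1.0]])
b = np.array([2.0, 2.0])

x_prev = np.array([1.5, 0.5, 0.8, 1.2])      # stage-(k+1) optimum, A x = b
Z = null_space(A)                            # basis of null(A)

# Any correction Z @ y preserves A x = b; y would be chosen to move the
# point toward feasibility of the *changed* constraints of stage k.
y = np.array([0.3, -0.1])                    # illustrative step
x_start = x_prev + Z @ y

print("residual of unchanged constraints:", np.abs(A @ x_start - b).max())
```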
Procedia PDF Downloads 418
2218 Laser Cooling of Internal Degrees of Freedom of Molecules: Cesium Case
Authors: R. Horchani
Abstract:
The optical pumping technique with laser fields, combined with the photo-association of ultra-cold atoms, makes it possible to control on demand the vibrational and/or rotational populations of molecules. Here, we review the basic concepts and the main steps that should be followed, including the excitation schemes and detection techniques we use to achieve the ro-vibrational cooling of Cs2 molecules. We also discuss the extension of this technique to other molecules. In addition, we present a theoretical model used to support the experiment. These simulations can be widely used in the preparation of various experiments, since they allow the optimization of several important experimental parameters.
Keywords: cold molecule, photo-association, optical pumping, vibrational and rotational cooling
Procedia PDF Downloads 301
2217 Optimal Number and Placement of Vertical Links in 3D Network-On-Chip
Authors: Nesrine Toubaline, Djamel Bennouar, Ali Mahdoum
Abstract:
3D technology can lead to a significant reduction in power and average hop count in Networks-on-Chip (NoCs). It offers short and fast vertical links, which cope with the long-wire problem in 2D NoCs. This work proposes a heuristic-based method to optimize the number and placement of vertical links to achieve specified performance goals. Experiments show that significant improvement can be achieved by using a specific number of vertical interconnects.
Keywords: interconnect optimization, monolithic inter-tier vias, network on chip, system on chip, through silicon vias, three dimensional integration circuits
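A hedged sketch of one plausible greedy heuristic for the problem the abstract poses: given a budget of vertical links, repeatedly place the link that most reduces the average hop count on a two-layer 3D mesh. The mesh size, budget, and hop model are assumptions, not the authors' exact method.

```python
import itertools
import networkx as nx

X, Y, LAYERS, BUDGET = 4, 4, 2, 4
G = nx.Graph()
for z, x, y in itertools.product(range(LAYERS), range(X), range(Y)):
    if x + 1 < X:
        G.add_edge((x, y, z), (x + 1, y, z))     # intra-layer mesh links
    if y + 1 < Y:
        G.add_edge((x, y, z), (x, y + 1, z))

# Candidate vertical links (one per (x, y) position between the layers).
candidates = [((x, y, 0), (x, y, 1)) for x in range(X) for y in range(Y)]
for _ in range(BUDGET):
    best = min(candidates,
               key=lambda e: nx.average_shortest_path_length(
                   nx.Graph(list(G.edges()) + [e])))
    G.add_edge(*best)                            # commit the best link
    candidates.remove(best)

print("avg hop count:", nx.average_shortest_path_length(G))
```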
Procedia PDF Downloads 303
2216 Traffic Density Measurement by Automatic Detection of the Vehicles Using Gradient Vectors from Aerial Images
Authors: Saman Ghaffarian, Ilgin Gökaşar
Abstract:
This paper presents a new automatic vehicle detection method for measuring traffic density from very high resolution aerial images. The proposed method starts by extracting road regions from the image using road vector data. Then, the road image is divided into equal sections, considering the resolution of the images. Gradient vectors of the road image are computed from the edge map of the corresponding image. Gradient vectors on each boundary of the sections are divided into groups where the gradient vectors significantly change their directions. Finally, the number of vehicles in each section is obtained by calculating the standard deviation of the gradient vectors in each group and accepting a group as a vehicle when its standard deviation is above a predefined threshold value. The proposed method was tested on four very high resolution aerial images acquired from Istanbul, Turkey, which illustrate roads and vehicles with diverse characteristics. The results show the reliability of the proposed method in detecting vehicles, producing an overall F1 accuracy of 86%.
Keywords: aerial images, intelligent transportation systems, traffic density measurement, vehicle detection
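A simplified sketch of the counting rule the abstract describes: compute gradient vectors on the road image, split them into groups where the gradient direction changes sharply, and count a group as a vehicle when the spread (standard deviation) of its directions exceeds a threshold. The input image, the magnitude cutoff, and both thresholds are assumptions.

```python
import cv2
import numpy as np

road = cv2.imread("road_section.png", cv2.IMREAD_GRAYSCALE)  # hypothetical
gx = cv2.Sobel(road, cv2.CV_64F, 1, 0)
gy = cv2.Sobel(road, cv2.CV_64F, 0, 1)
mag = np.hypot(gx, gy)
ang = np.arctan2(gy, gx)[mag > 50.0]         # directions on strong edges

# Split the direction sequence where consecutive directions jump sharply.
jumps = np.where(np.abs(np.diff(ang)) > 1.0)[0] + 1
groups = np.split(ang, jumps)

vehicles = sum(1 for g in groups if len(g) > 5 and g.std() > 0.4)
print("estimated vehicles in this section:", vehicles)
```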
Procedia PDF Downloads 379