Search results for: super hard
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1442

722 The Effectiveness of Cash Flow Management by SMEs in the Mafikeng Local Municipality of South Africa

Authors: Ateba Benedict Belobo, Faan Pelser, Ambe Marcus

Abstract:

Aims: This study arose from repeated complaints, received by electronic mail, about the underperformance of Mafikeng small and medium-sized enterprises (SMEs) after the global financial crisis. The authors were of the view that this poor performance could be a result of negative effects on the cash flow of these businesses due to volatilities in the business environment following the global crisis. Thus, the paper was mainly aimed at determining the shortcomings experienced by these SMEs with regard to cash flow management. It was also aimed at suggesting possible measures to improve the cash flow management of these SMEs in this tough time. Methods: A case study was conducted on 3 beverage suppliers, 27 bottle stores, the 3 largest fast-moving consumer goods supermarkets, and 7 automobile enterprises in the Mafikeng local municipality. A mixed-methods research design was employed, and purposive sampling was used to select the SMEs that participated. Views and experiences of participants were captured through in-depth interviews. Data from the empirical investigation were interpreted using open coding and a simple percentage formula. Results: Findings from the empirical research reflected that the majority of Mafikeng SMEs suffered poor operational performance after the global financial crisis, primarily as a result of poor cash flow management. The empirical outcome also indicated other, secondary factors contributing to this poor operational performance. Conclusion: Finally, the authors proposed possible measures that could be used to improve cash flow management and to address the other factors affecting the operational performance of SMEs in the Mafikeng local municipality, in order to achieve better business performance.

Keywords: cash flow, business performance, global financial crisis, SMEs

Procedia PDF Downloads 416
721 Swallowing Outcomes in Supraglottic Cancer Patients after Trans-Oral Robotic Surgery (TORS) Provided with Early Dysphagia Management Using Standardized Functional and Objective Measures

Authors: Hitesh Gupta, Surender Dabas

Abstract:

TORS is gaining increasingly widespread use and has been explored as a minimally invasive surgery for the treatment of supraglottic cancer (SGC). Given the critical role of the supraglottis in deglutition, swallowing outcomes after TORS remain a most important consideration. Available published studies show inconsistent swallowing outcomes and are deficient in standardized outcome measures and in descriptions of swallowing recovery and rehabilitation. The objective of this study is therefore to determine swallowing outcomes in SGC patients after TORS when provided with early dysphagia management using standardized measures. Sixteen patients were prospectively recruited who underwent TORS for a primary tumor of the supraglottis, involving one or more sub-sites or invading sites beyond the supraglottis, at the BLK Super Specialty Hospital, New Delhi, from March 2019 to June 2020. All patients were evaluated for dysphagia, with subsequent swallowing rehabilitation, on postoperative day 3 in the hospital or at the time of discharge, whichever was earlier. The Functional Oral Intake Scale (FOIS) and the penetration-aspiration score (PAS) were used as outcome measures to quantify swallowing recovery at one month and six months postoperatively. After TORS, patients achieved functional swallowing in less than one month where resection was limited to the supraglottis, while recovery was delayed in patients with resection extended to the tongue base or hypopharynx. Overall, of the 16 cases across all supraglottic sub-categories, 13 (81%) could have their NG tube removed (FOIS ≥5 and PAS=1) within 6 months; of these, 8 cases (62%) achieved functional swallowing in less than one month. Swallowing outcomes after TORS supraglottic laryngectomy are favorable when early dysphagia management (swallowing rehabilitation) is provided.

Keywords: dysphagia, supraglottic cancer, swallowing, TORS

Procedia PDF Downloads 93
720 A Genetic Algorithm Approach to Solve a Weaving Job Scheduling Problem, Aiming Tardiness Minimization

Authors: Carolina Silva, João Nuno Oliveira, Rui Sousa, João Paulo Silva

Abstract:

This study uses genetic algorithms to solve a job scheduling problem in a weaving factory. The underlying problem is NP-hard, concerning unrelated parallel machines with sequence-dependent setup times. This research uses real data from a weaving company located in the north of Portugal, with a capacity of 96 looms and a production, on average, of 440,000 meters of fabric per month. Moreover, the study involves a high level of complexity, since most of the real production constraints are applied and several real data instances are tested. Topics such as data analysis and algorithm performance are addressed and tested, in order to offer a solution that can generate reliable due-date results. All the approaches will be tested in the operational environment and their KPIs monitored, to understand the solution's impact on production, with a particular focus on the total number of weeks of late deliveries to clients. Thus, the main goal of this research is to develop a solution that automatically generates optimized production plans, aiming at tardiness minimization.
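As a toy illustration of the underlying optimization (not the authors' implementation), the following sketch evolves a job order and machine assignment to minimize total tardiness on unrelated parallel machines with sequence-dependent setup times; the instance data are made up:

```python
import random

random.seed(42)

# Hypothetical toy instance (illustrative, not the weaving company's data).
N_JOBS, N_MACHINES = 8, 2
proc = [[random.randint(2, 9) for _ in range(N_MACHINES)] for _ in range(N_JOBS)]
due = [random.randint(5, 30) for _ in range(N_JOBS)]
setup = [[random.randint(0, 3) for _ in range(N_JOBS)] for _ in range(N_JOBS)]

def tardiness(chrom):
    """Total tardiness of a chromosome: (job order, machine assignment)."""
    order, assign = chrom
    t = [0] * N_MACHINES          # current completion time per machine
    last = [None] * N_MACHINES    # last job processed per machine
    total = 0
    for j in order:
        m = assign[j]
        if last[m] is not None:
            t[m] += setup[last[m]][j]   # sequence-dependent setup time
        t[m] += proc[j][m]
        last[m] = j
        total += max(0, t[m] - due[j])
    return total

def random_chrom():
    order = random.sample(range(N_JOBS), N_JOBS)
    assign = [random.randrange(N_MACHINES) for _ in range(N_JOBS)]
    return (order, assign)

def mutate(chrom):
    order, assign = list(chrom[0]), list(chrom[1])
    i, j = random.sample(range(N_JOBS), 2)
    order[i], order[j] = order[j], order[i]          # swap two positions
    assign[random.randrange(N_JOBS)] = random.randrange(N_MACHINES)
    return (order, assign)

def evolve(pop_size=30, generations=200):
    pop = [random_chrom() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=tardiness)
        survivors = pop[:pop_size // 2]              # elitist truncation
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=tardiness)

best = evolve()
print("best total tardiness:", tardiness(best))
```

A real instance would add crossover and the full set of production constraints; the elitist mutation-only loop above is the minimal recognizable GA skeleton.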

Keywords: genetic algorithms, textile industry, job scheduling, optimization

Procedia PDF Downloads 139
719 Inclusion of Students with Disabilities (SWD) in Higher Education Institutions (HEIs): Self-Advocacy and Engagement as Central

Authors: Tadesse Abera

Abstract:

This study aimed to investigate the contribution of self-advocacy and engagement to the inclusion of SWDs in HEIs. A convergent parallel mixed-methods design was employed; this article reports the quantitative strand. A total of 246 SWDs were selected through a stratified proportionate random sampling technique from five public HEIs in Ethiopia. Data were collected through a self-advocacy questionnaire, a student engagement scale, and a college student experience questionnaire, and analyzed through frequency, percentage, mean, standard deviation, correlation, one-sample t-test, and multiple regression. Both self-advocacy and engagement were found to have predictive power on the inclusion of respondents in the HEIs, with engagement the stronger predictor. Among the components of self-advocacy, knowledge of self and leadership, and among the engagement dimensions, sense of belonging, cognitive engagement, and valuing, in their respective orders, were found to have the strongest predictive power on the inclusion of respondents in the institutions. Based on the findings, it was concluded that if students with disabilities work hard to be self-determined, strive to realize social justice, exert quality effort, and seek active involvement, their inclusion in the institutions would be ensured.

Keywords: self-advocacy, engagement, inclusion, students with disabilities, higher education institution

Procedia PDF Downloads 58
718 Reversal of Testicular Damage and Subfertility by Resveratrol

Authors: Samy S. Eleawa, Mahmoud A. Alkhateeb, Fahaid H. Alhashem, Ismaeel bin-Jaliah, Hussein F. Sakr, Hesham M. Elrefaey, Abbas O. Elkarib, Mohammad A. Haidara, Abdullah S. Shatoor, Mohammad A. Khalil

Abstract:

The effect of resveratrol (RES) against CdCl2-induced toxicity in the rat testes was investigated. Seven experimental groups of adult male rats were formed as follows: A) controls + NS, B) control + vehicle (saline solution of hydroxypropyl cyclodextrin), C) RES-treated, D) CdCl2 + NS, E) CdCl2 + vehicle, F) RES followed by CdCl2, and G) CdCl2 followed by RES. At the end of the protocol, serum levels of FSH, LH, and testosterone were measured in all groups. Testicular levels of TBARS and superoxide dismutase (SOD) activity were also measured. Epididymal semen analysis was performed, and testicular expression of Bcl-2, p53, and Bax was assessed by RT-PCR. Histopathological changes of the testes were also examined microscopically and described. Pre- and post-administration of RES in cadmium chloride-intoxicated rats improved semen parameters, including count, motility, daily sperm production, and morphology; increased serum concentrations of gonadotropins and testosterone; decreased testicular lipid peroxidation; and increased SOD activity. RES not only attenuated cadmium chloride-induced testicular histopathology but also protected against the onset of cadmium chloride testicular toxicity. Cadmium chloride downregulated the anti-apoptotic gene Bcl-2 and upregulated the expression of both pro-apoptotic genes p53 and Bax. The antioxidant activity of RES thus protects against, and partially reverses, cadmium chloride testicular toxicity via upregulation of Bcl-2 and downregulation of p53 and Bax expression. These findings have far-reaching implications for the subfertility and impotency frequently seen in hypertensive as well as metabolic syndrome patients.

Keywords: resveratrol, cadmium, infertility, sperm, testis, metabolic syndrome

Procedia PDF Downloads 518
717 Feasibility of Implementing Digital Healthcare Technologies to Prevent Disease: A Mixed-Methods Evaluation of a Digital Intervention Piloted in the National Health Service

Authors: Rosie Cooper, Tracey Chantler, Ellen Pringle, Sadie Bell, Emily Edmundson, Heidi Nielsen, Sheila Roberts, Michael Edelstein, Sandra Mounier Jack

Abstract:

Introduction: In line with the National Health Service's (NHS) long-term plan, the NHS is looking to implement more digital health interventions. This study explores a case study in this area: a digital intervention used by NHS Trusts in London to consent adolescents for Human Papilloma Virus (HPV) immunisation. Methods: The electronic consent intervention was implemented in 14 secondary schools in inner-city London. These schools were statistically matched with 14 schools from the same area that were consenting using paper forms; schools were matched on deprivation and English as an additional language. Consent form return rates and HPV vaccine uptake were compared quantitatively between intervention and matched schools. Data from observations of immunisation sessions and school feedback forms were analysed thematically. Individual and group interviews with implementers, parents, and adolescents, and a focus group with adolescents, were undertaken and analysed thematically. Results: Twenty-eight schools (14 e-consent schools and 14 paper consent schools) comprising 3219 girls (1733 in paper consent schools and 1486 in e-consent schools) were included in the study. The proportion of pupils eligible for free school meals, the proportion with English as an additional language, and the students' ethnicity profile were similar between the e-consent and paper consent schools. Return of consent forms was not increased by the implementation of the e-consent intervention. There was no difference in the proportion of pupils vaccinated at the scheduled vaccination session between the paper (n=14) and e-consent (n=14) schools (80.6% vs. 81.3%, p=0.93). The transition to using the system was not straightforward: whilst schools and staff understood the potential benefits, they found it difficult to adapt to new ways of working that removed some level of control from schools. Part of the reason for lower consent form return in e-consent schools was that some parents found the intervention difficult to use, owing to limited access to the internet, difficulty opening the weblink, language barriers, and, in some cases, the system closing a few days before sessions. Adolescents also highlighted the potential for e-consent interventions to bypass their information needs. Discussion: We would advise caution against dismissing the e-consent intervention because it did not achieve its goal of increasing the return of consent forms. Given the problems of embedding a new service, it was encouraging that HPV vaccine uptake remained stable. Introducing change requires stakeholders to understand, buy in, and work together with others. Schools and staff understood the potential benefits of e-consent but found it hard to adapt to new ways of working that removed some level of control from schools, suggesting that implementing digital technology requires an embedding process. Conclusion: The future direction of the NHS will require the implementation of digital technology. Obtaining electronic consent from parents could help streamline school-based adolescent immunisation programmes. Findings from this study suggest that when implementing new digital technologies, it is important to allow for a period of embedding so that they become incorporated into everyday practice.
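The reported comparison of vaccination proportions can be illustrated with a standard two-proportion z-test. The counts below are hypothetical (chosen only to roughly match the reported 80.6% vs. 81.3%), so the resulting p-value will not reproduce the study's p=0.93:

```python
import math

def two_proportion_p(x1, n1, x2, n2):
    """Two-sided p-value for H0: the two proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))   # P(|Z| > z) for standard normal

# Hypothetical counts giving ~80.6% vs ~81.3% vaccinated at the session.
p = two_proportion_p(1129, 1400, 976, 1200)
print("p-value:", round(p, 2))
```

With proportions this close and samples this large, the test unsurprisingly finds no significant difference, which is the study's quantitative conclusion.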

Keywords: consent, digital, immunisation, prevention

Procedia PDF Downloads 123
716 Computational Fluid Dynamics Simulations of Thermal and Flow Fields inside a Desktop Personal Computer Cabin

Authors: Mohammad Salehi, Mohammad Erfan Doraki

Abstract:

In this paper, the airflow inside a desktop computer case is analyzed using computational fluid dynamics (CFD) simulation. The purpose is to investigate the cooling of central processing units (CPUs) with thermal capacities of 80 and 130 watts. The computer enclosure, based on the microATX form factor, contains the main heat-producing components: the CPU, hard disk drive, CD drive, floppy drive, memory card, and power supply unit. According to the thermal power produced by the CPU (80 or 130 watts), two different heat sink geometries, direct and radial, were used. First, mesh independence and validation of the solution were established; after ensuring the correctness of the numerical solution, the results were analyzed. The simulation results showed that the temperatures of the CPU and other components increased linearly with increasing CPU heat output. The ambient air temperature also has a significant effect on the maximum processor temperature.
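The reported linear relation between CPU heat output and component temperature can be sketched with a first-order lumped thermal model; this is not the paper's CFD, and the thermal resistance value below is hypothetical:

```python
# Steady-state lumped model: junction temperature rises linearly with
# dissipated power through an effective thermal resistance R_th (K/W).
# The value 0.45 K/W is a made-up illustrative figure.
def cpu_temperature(power_w, ambient_c, r_th_k_per_w=0.45):
    return ambient_c + r_th_k_per_w * power_w

t80 = cpu_temperature(80, 25.0)    # 80 W CPU at 25 degC ambient
t130 = cpu_temperature(130, 25.0)  # 130 W CPU at the same ambient
print(t80, t130)
```

The model also captures the ambient-temperature effect noted in the abstract: raising the ambient by X degrees raises the steady-state CPU temperature by exactly X degrees.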

Keywords: computational fluid dynamics, CPU cooling, computer case simulation, heat sink

Procedia PDF Downloads 104
715 Definition and Core Components of the Role-Partner Allocation Problem in Collaborative Networks

Authors: J. Andrade-Garda, A. Anguera, J. Ares-Casal, M. Hidalgo-Lorenzo, J.-A. Lara, D. Lizcano, S. Suárez-Garaboa

Abstract:

In the current, constantly changing economic context, collaborative networks allow partners to undertake projects that would not be possible if attempted individually. These projects usually involve the performance of a group of tasks (named roles) that have to be distributed among the partners. Thus, an allocation/matching problem arises that will be referred to as the Role-Partner Allocation problem. In real life, this situation is addressed by negotiation between partners in order to reach ad hoc agreements. Besides taking a long time and being hard work, such an approach is not recommended, as both historical evidence and economic analysis show. Instead, the allocation process should be automated by means of a centralized matching scheme. However, as a preliminary step in the search for such a matching mechanism (or the development of a new one), the problem and its core components must be specified. To this end, this paper establishes (i) the definition of the problem and its constraints; (ii) the key features of the involved elements (i.e., roles and partners); and (iii) how to create preference lists for both roles and partners. Only in this way will it be possible to conduct subsequent methodological research on the solution method.
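One classical centralized matching scheme of the kind the authors call for is deferred acceptance (Gale-Shapley), sketched here with preference lists for roles and partners; the choice of mechanism and the toy preferences are illustrative, not the paper's:

```python
from collections import deque

def deferred_acceptance(role_prefs, partner_prefs):
    """Role-proposing Gale-Shapley: returns a stable role -> partner map."""
    # Rank tables: lower rank means more preferred.
    rank = {p: {r: i for i, r in enumerate(prefs)}
            for p, prefs in partner_prefs.items()}
    free = deque(role_prefs)                  # roles not yet matched
    next_choice = {r: 0 for r in role_prefs}  # next partner index to try
    engaged = {}                              # partner -> role currently held
    while free:
        r = free.popleft()
        p = role_prefs[r][next_choice[r]]
        next_choice[r] += 1
        if p not in engaged:
            engaged[p] = r                    # partner accepts tentatively
        elif rank[p][r] < rank[p][engaged[p]]:
            free.append(engaged[p])           # displaced role re-enters queue
            engaged[p] = r
        else:
            free.append(r)                    # rejected; will propose again
    return {r: p for p, r in engaged.items()}

# Hypothetical 3x3 instance.
role_prefs = {"r1": ["p1", "p2", "p3"],
              "r2": ["p1", "p3", "p2"],
              "r3": ["p2", "p1", "p3"]}
partner_prefs = {"p1": ["r2", "r1", "r3"],
                 "p2": ["r1", "r3", "r2"],
                 "p3": ["r1", "r2", "r3"]}
match = deferred_acceptance(role_prefs, partner_prefs)
print(match)
```

The resulting matching is stable: no role and partner both prefer each other to their assigned match, which is the kind of guarantee a centralized scheme can offer that ad hoc negotiation cannot.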

Keywords: collaborative network, matching, partner, preference list, role

Procedia PDF Downloads 210
714 Experimental Optimization in Diamond Lapping of Plasma Sprayed Ceramic Coatings

Authors: S. Gowri, K. Narayanasamy, R. Krishnamurthy

Abstract:

Plasma spraying, from the point of view of value engineering, is considered a cost-effective technique for depositing high-performance ceramic coatings on ferrous substrates for use in the aerospace, automobile, electronics, and semiconductor industries. High-performance ceramics such as alumina, zirconia, and titania-based ceramics have become a key part of turbine blades, automotive cylinder liners, and microelectronic and semiconductor components due to their ability to insulate and distribute heat. However, as these industries continue to advance, improved methods are needed to increase both the flexibility and the speed of ceramic processing. The ceramics mentioned were individually coated on structural steel substrates with a NiCr bond coat of 50-70 micron thickness, with a final thickness in the range of 150 to 200 microns. Optimal spray parameters were selected based on bond strength and porosity. The optimally processed specimens were superfinished by lapping using diamond and green SiC abrasives. Interesting results were observed: green SiC improved the surface finish of the lapped alumina and titania-based ceramics almost as well as diamond did, but diamond abrasives improved the surface finish of PSZ better than green SiC. The conventional random scratches were absent in the alumina and titania ceramics, and in PSZ such marks were found to be fewer. Moreover, the flatness accuracy was improved by up to 60-85%. The surface finish and geometrical accuracy were measured and modeled. Abrasives in the mid-range of particle size improved the surface quality faster and better than particles in the low and high size ranges. From the experimental investigations of the lapping process, the optimal lapping time, abrasive size, lapping pressure, etc. could be evaluated.

Keywords: atmospheric plasma spraying, ceramics, lapping, surface quality, optimization

Procedia PDF Downloads 402
713 A Polynomial Approach for a Graphical-based Integrated Production and Transport Scheduling with Capacity Restrictions

Authors: M. Ndeley

Abstract:

The performance of global manufacturing supply chains depends on the interaction of production and transport processes. Currently, the scheduling of these processes is done separately, without considering mutual requirements, which leads to suboptimal solutions. An integrated scheduling of both processes enables improved supply chain performance. The integrated production and transport scheduling problem (PTSP) is NP-hard, so heuristic methods are necessary to efficiently solve large problem instances, as in the case of global manufacturing supply chains. This paper presents a heuristic scheduling approach which handles the integration of flexible production processes with intermodal transport, incorporating flexible land transport. The method is based on a graph that allows a reformulation of the PTSP as a shortest path problem for each job, which can be solved in polynomial time. The proposed method is applied to a supply chain scenario with a manufacturing facility in South Africa and shipments of finished product to customers within the country. The results show that the approach is suitable for scheduling large-scale problems and can be flexibly adapted to different scenarios.
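The per-job shortest-path reformulation can be sketched with Dijkstra's algorithm on a small hypothetical state graph (nodes, arcs, and weights are invented for illustration; the paper's actual graph construction is not reproduced here):

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra on a weighted digraph given as {u: [(v, w), ...]}."""
    dist, prev = {src: 0}, {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                          # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return dist[dst], path[::-1]

# Hypothetical state graph for one job: nodes are production/transport
# states, arc weights are step costs (e.g. hours or currency units).
graph = {
    "start":   [("prodA", 3), ("prodB", 2)],
    "prodA":   [("portJNB", 4)],
    "prodB":   [("portJNB", 6), ("portDUR", 3)],
    "portJNB": [("customer", 5)],
    "portDUR": [("customer", 4)],
}
cost, path = shortest_path(graph, "start", "customer")
print(cost, path)
```

Solving one such shortest-path problem per job is what keeps each scheduling step polynomial, even though the joint PTSP remains NP-hard.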

Keywords: production and transport scheduling problem, graph based scheduling, integrated scheduling

Procedia PDF Downloads 459
712 South Asia’s Political Landscape: Precipitating Terrorism

Authors: Saroj Kumar Rath

Abstract:

India's Muslims represent 15 percent of the nation's population, the world's third-largest Muslim population after those of Indonesia and Pakistan. Extremist groups like the Islamic State, Al Qaeda, the Taliban, and the Haqqani network increasingly view India as a target. Several trends explain the rise: terrorism threats in South Asia are linked and mobile; if one source is shut down, jihadists relocate to find another Islamic cause. As NATO withdraws from Afghanistan, some jihadists will eye India. Pakistan regards India as a top enemy, and some officials even encourage terrorists to target areas like Kashmir or Mumbai. Meanwhile, a stream of Wahhabi preachers has visited India, offering hard-line messages; extremist groups like Al Qaeda and the Islamic State compete for influence, and militants even pay jihadists. Muslims, as a minority population in India, could offer fertile ground for extremist recruiters. This paper argues that there is an urgent need for the Indian government to profile militants and examine social media sites to counter Wahhabi indoctrination, while supporting education and entrepreneurship for all of India's citizens.

Keywords: Al Qaeda, terrorism, Islamic state, India, haqqani network, Pakistan, Taliban

Procedia PDF Downloads 599
711 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model

Authors: Donatella Giuliani

Abstract:

In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. Firstly, the Firefly Algorithm is applied in a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique inspired by the flashing behavior of fireflies; in this context, it is used to determine the number of clusters and the corresponding cluster means in a histogram-based segmentation approach. These means are then used in the initialization step of the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as prior probabilities of each component. Applying Bayes' rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears fairly solid and reliable even when applied to complex grayscale images. Validation was performed using different standard measures, namely the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK), and the Davies-Bouldin (DB) index. The achieved results strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is the use of the maxima of the responsibilities for pixel assignment, which implies a considerable reduction in computational cost.
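A minimal one-dimensional sketch of the EM stage: starting from given cluster means (which the paper obtains with the Firefly Algorithm; here they are simply seeded by hand), fit a two-component Gaussian mixture to synthetic intensities and assign each value by its maximum responsibility:

```python
import math
import random

random.seed(0)
# Synthetic bimodal "grayscale intensities" (illustrative, not image data).
data = ([random.gauss(60, 8) for _ in range(300)] +
        [random.gauss(170, 12) for _ in range(300)])

def gaussian(x, mean, variance):
    return (math.exp(-(x - mean) ** 2 / (2 * variance)) /
            math.sqrt(2 * math.pi * variance))

def em_gmm_1d(xs, means, iters=50):
    """EM for a k-component 1-D Gaussian mixture, initialized at `means`."""
    k = len(means)
    w, mu, var = [1.0 / k] * k, list(means), [100.0] * k
    for _ in range(iters):
        # E-step: responsibilities = posterior component probabilities.
        resp = []
        for x in xs:
            dens = [w[j] * gaussian(x, mu[j], var[j]) for j in range(k)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: re-estimate weights, means, and variances.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            w[j] = nj / len(xs)
            mu[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            var[j] = sum(r[j] * (x - mu[j]) ** 2 for r, x in zip(resp, xs)) / nj
    return w, mu, var

# Initial means stand in for the Firefly Algorithm's histogram search.
w, mu, var = em_gmm_1d(data, means=[50.0, 180.0])

def assign(x):
    """Pixel assignment by maximum responsibility (posterior)."""
    dens = [w[j] * gaussian(x, mu[j], var[j]) for j in range(2)]
    return dens.index(max(dens))

print("fitted means:", [round(m, 1) for m in mu])
```

For an actual image, `data` would be the pixel intensities (or the histogram bins, weighted by counts), and `assign` would label every pixel.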

Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation

Procedia PDF Downloads 202
710 A Study of the Trade-off Energy Consumption-Performance-Schedulability for DVFS Multicore Systems

Authors: Jalil Boudjadar

Abstract:

Dynamic Voltage and Frequency Scaling (DVFS) multicore platforms are promising execution platforms that enable high computational performance, lower energy consumption, and flexibility in scheduling system processes. However, the resulting interleaving and memory interference, together with per-core frequency tuning, make real-time guarantees hard to deliver. Moreover, energy consumption represents a strong constraint for deploying such systems in energy-limited settings. Identifying the system configurations that achieve high performance and consume less energy while guaranteeing schedulability is a complex task in the design of modern embedded systems. This work studies the trade-off between energy consumption, core utilization, and memory bottlenecks, and their impact on the schedulability of DVFS multicore time-critical systems with a hierarchy of shared memories. We build a model-based framework using the Parametrized Timed Automata of UPPAAL to analyze the mutual impact of performance, energy consumption, and schedulability of DVFS multicore systems, and demonstrate the trade-off on an actual case study.
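The trade-off can be sketched numerically: under the common first-order assumptions that dynamic power scales as f³ and single-core schedulability is checked with the EDF utilization bound (neither is the paper's UPPAAL model, and the task set is invented), the lowest schedulable frequency minimizes energy:

```python
# Hypothetical periodic task set: (WCET in cycles, period in seconds).
tasks = [(2e6, 0.010), (1e6, 0.020), (4e6, 0.040)]
freqs = [0.4e9, 0.8e9, 1.2e9, 1.6e9]   # candidate core frequencies (Hz)

def utilization(f):
    """EDF bound on one core: sum of (C_i / f) / T_i must be <= 1."""
    return sum(c / f / t for c, t in tasks)

def energy(f, hyper=0.040, k=1e-27):
    """Dynamic energy over one hyperperiod, with P_dyn ~ k * f**3."""
    busy = sum((c / f) * (hyper / t) for c, t in tasks)  # busy time (s)
    return k * f ** 3 * busy

feasible = [(f, energy(f)) for f in freqs if utilization(f) <= 1.0]
best_f, best_e = min(feasible, key=lambda fe: fe[1])
print(f"best frequency: {best_f / 1e9:.1f} GHz, energy: {best_e:.2e} J")
```

Since busy time scales as 1/f, energy grows as f², so scaling down to the slowest frequency that still satisfies the utilization bound is the energy-optimal configuration in this simplified model; memory interference, which the paper does model, would tighten the schedulability side of the trade-off.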

Keywords: time-critical systems, multicore systems, schedulability analysis, energy consumption, performance analysis

Procedia PDF Downloads 89
709 A Neural Network System for Predicting the Hardness of Titanium Aluminum Nitride (TiAlN) Coatings

Authors: Omar M. Elmabrouk

Abstract:

In the high-speed machining process, the cutting tool consistently deals with high localized stress at the tool tip, tip temperatures exceeding 800°C, and the chip sliding along the rake face. These conditions affect tool wear, cutting tool performance, the quality of the produced parts, and tool life. Therefore, a thin-film coating on the cutting tool should be considered to improve the tool's surface properties while maintaining its bulk properties. One of the common processes for applying thin hard coatings is PVD magnetron sputtering. In this paper, the effects of the PVD magnetron sputtering process parameters, sputter power in the range of 4.81-7.19 kW, bias voltage in the range of 50.00-300.00 volts, and substrate temperature in the range of 281.08-600.00 °C, on coating hardness were predicted using an artificial neural network (ANN). The results were compared with previously published results using an RSM model. It was found that the ANN is more accurate in predicting tool hardness; hence, it will not only improve tool life but also significantly enhance the efficiency of machining processes.
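A minimal sketch of the idea: a small feed-forward network with one hidden layer, trained by gradient descent to map the three sputtering parameters to hardness. The data are synthetic (a made-up smooth function standing in for the experiments), so this demonstrates the mechanics only, not the paper's trained model:

```python
import math
import random

random.seed(1)

# Made-up smooth "hardness" function of the three process parameters.
def hardness(power, bias, temp):
    return 20 + 2.0 * power - 0.005 * bias + 0.01 * temp

samples = []
for _ in range(200):
    p = random.uniform(4.81, 7.19)     # sputter power (kW)
    b = random.uniform(50.0, 300.0)    # bias voltage (V)
    t = random.uniform(281.08, 600.0)  # substrate temperature (deg C)
    x = [(p - 6.0) / 1.2, (b - 175.0) / 125.0, (t - 440.0) / 160.0]  # scaled
    y = (hardness(p, b, t) - 35.0) / 5.0                             # scaled
    samples.append((x, y))

H = 6  # hidden units
w1 = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + bh)
         for row, bh in zip(w1, b1)]
    return h, sum(wi * hi for wi, hi in zip(w2, h)) + b2

def mse():
    return sum((forward(x)[1] - y) ** 2 for x, y in samples) / len(samples)

lr, loss_before = 0.01, mse()
for _ in range(300):                      # plain SGD epochs
    for x, y in samples:
        h, out = forward(x)
        err = out - y
        for j in range(H):                # backprop through hidden layer
            grad_h = err * w2[j] * (1 - h[j] ** 2)
            w2[j] -= lr * err * h[j]
            for i in range(3):
                w1[j][i] -= lr * grad_h * x[i]
            b1[j] -= lr * grad_h
        b2 -= lr * err
print("MSE before/after:", round(loss_before, 3), round(mse(), 4))
```

In practice one would train on the experimental (parameter, measured hardness) pairs and compare predictions against the RSM model's, as the paper does.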

Keywords: artificial neural network, hardness, prediction, titanium aluminium nitride coating

Procedia PDF Downloads 538
708 Public Wi-Fi Security Threat Evil Twin Attack Detection Based on Signal Variant and Hop Count

Authors: Said Abdul Ahad Ahadi, Elyas Baray, Nitin Rakesh, Sudeep Varshney

Abstract:

Wi-Fi is a widely used internet access technology deployed in many areas such as stores, cafes, university campuses, and restaurants. This technology has brought great facility to communication and networking. On the other hand, because data are transmitted over the air, the network is vulnerable and prone to various threats, such as the Evil Twin attack. An Evil Twin is an adversary that impersonates a legitimate access point (LAP) by spoofing its name (SSID) and MAC address (BSSID). This attack can enable further threats such as man-in-the-middle (MITM) attacks, service interruption, and access point service blocking. Various Evil Twin attack detection techniques have been proposed, but they require either additional hardware or protocol modification. In this paper, we propose a new technique based on two access point fingerprints, the Received Signal Strength Indicator (RSSI) and the hop count, which are hard for an adversary to copy. We implemented the technique in a system called "ETDetector," which can detect and prevent the attack.
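A minimal sketch of fingerprint-based detection under stated assumptions: a whitelist stores each legitimate AP's expected hop count and typical RSSI range. All values are hypothetical, and this is deliberately far simpler than ETDetector's actual logic:

```python
# Hypothetical whitelist entry for one legitimate access point.
legit_profile = {"ssid": "CampusWiFi", "bssid": "aa:bb:cc:dd:ee:ff",
                 "hop_count": 4, "rssi_range": (-65, -40)}

def looks_like_evil_twin(observed, profile, rssi_slack=10, hop_slack=0):
    """Flag an AP whose SSID matches but whose fingerprints deviate.

    RSSI gets a slack band (signal strength varies naturally); the hop
    count to a fixed reference server is expected to be stable.
    """
    if observed["ssid"] != profile["ssid"]:
        return False                       # different network, out of scope
    lo, hi = profile["rssi_range"]
    rssi_bad = not (lo - rssi_slack <= observed["rssi"] <= hi + rssi_slack)
    hops_bad = abs(observed["hop_count"] - profile["hop_count"]) > hop_slack
    return rssi_bad or hops_bad

# A rogue AP typically sits close to the victim (strong signal) and routes
# through its own uplink (different hop count).
rogue = {"ssid": "CampusWiFi", "rssi": -20, "hop_count": 2}
genuine = {"ssid": "CampusWiFi", "rssi": -55, "hop_count": 4}
```

A real detector would also track RSSI variance over time (the "signal variant" of the title) rather than a single reading.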

Keywords: evil twin, LAP, SSID, Wi-Fi security, signal variation, ETAD, Kali Linux, Scapy, Python

Procedia PDF Downloads 128
707 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation

Authors: Sopheak Sorn, Kwok Yip Szeto

Abstract:

Reliability is one of the important measures of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), the reliability of the system becomes a network design problem, which is an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction, with each cell containing exactly one point of the point pattern, and compute the reliability of the Voronoi diagram's dual, i.e., the Delaunay graph. We further investigate the communicability of the Delaunay network. We find that the homogeneity of a Delaunay network's degree distribution correlates positively with its reliability and negatively with its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is then used to optimize a Delaunay network toward the optimum geometric mean of communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
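All-terminal reliability of a small graph can be estimated by Monte Carlo simulation of independent link failures, as a rough sketch of the reliability side of the problem (the graph and the failure probability are illustrative, and communicability is not modeled here):

```python
import random

random.seed(3)

# Small illustrative planar graph; each link fails independently.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4), (2, 4)]
n = 5

def connected(active):
    """Depth-first search: True if the surviving links span all n nodes."""
    adj = {i: [] for i in range(n)}
    for u, v in active:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == n

def all_terminal_reliability(p_fail=0.1, trials=20000):
    """Estimate P(network stays connected) under independent link failures."""
    ok = 0
    for _ in range(trials):
        active = [e for e in edges if random.random() > p_fail]
        ok += connected(active)
    return ok / trials

r = all_terminal_reliability()
print("estimated reliability:", round(r, 3))
```

An edge flip (replacing one diagonal of a quadrilateral with the other) keeps N and L fixed but changes the degree sequence, which is exactly the move the paper uses to trade reliability against communicability.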

Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi

Procedia PDF Downloads 403
706 The Impact of Experiential Learning on the Success of Upper Division Mechanical Engineering Students

Authors: Seyedali Seyedkavoosi, Mohammad Obadat, Seantorrion Boyle

Abstract:

The purpose of this study is to assess the effectiveness of a nontraditional experiential learning strategy in improving the success and interest of mechanical engineering students, using the Kinematics/Dynamics of Machines course as a case study. This upper-division technical course covers a wide range of topics, including mechanism and machine system analysis and synthesis, yet the complexities of ideas like acceleration, motion, and machine component relationships are hard to explain using standard teaching techniques. To address this problem, a comprehensive design project was created that gave students hands-on experience designing, manufacturing, and testing their inventions. The main goals of the project were to improve students' grasp of machine design and kinematics, to develop problem-solving and presentation skills, and to familiarize them with professional software. A questionnaire survey was conducted to evaluate the effect of this approach on students' performance and interest in mechanical engineering. The outcomes of the study shed light on the usefulness of nontraditional experiential learning approaches in engineering education.

Keywords: experiential learning, nontraditional teaching, hands-on design project, engineering education

Procedia PDF Downloads 74
705 The Effect of Heating-Liquid Nitrogen Cooling on Fracture Toughness of Anisotropic Rock

Authors: A. Kavandi, K. Goshtasbi, M. R. Hadei, H. Nejati

Abstract:

In geothermal energy production, liquid nitrogen (LN₂) fracturing of hot, dry rock is one of the most effective methods for increasing the permeability of the reservoir. Geothermal reservoirs mainly consist of hard rocks, such as granite, and metamorphic rocks, like gneiss, at high temperatures. Gneiss, as a metamorphic rock, exhibits a high degree of inherent anisotropy, which affects the mechanical behavior of the rock. The aim of this study is to investigate the effects of heating-liquid nitrogen cooling treatment and rock anisotropy on the fracture toughness of gneiss. To this end, a series of semi-circular bend (SCB) tests was carried out on gneiss specimens with different anisotropy plane angles (0°, 30°, 60°, and 90°). The specimens were exposed to a heating-cooling treatment of gradual heating to 100°C followed by LN₂ cooling. Results indicate that the fracture toughness of treated samples is lower than that of untreated samples and that fracture toughness increases with increasing anisotropy plane angle. The scanning electron microscope (SEM) technique is also employed to evaluate the fracture process zone (FPZ) ahead of the crack tip.

Keywords: heating-cooling, anisotropic rock, fracture toughness, liquid nitrogen

Procedia PDF Downloads 46
704 Teaching Continuities in the Great Books Tradition and Contemporary Popular Culture

Authors: Alex Kizuk

Abstract:

This paper studies the trope or meme of the Siren in terms of what long-standing cultural continuities can be found in college classrooms today. Those who have raised children may remember reading from Hans Christian Andersen's 'The Little Mermaid' (1837), not to mention regaling them with colorful Disneyesque versions when they were younger. Though Andersen tempered the darker first ending of the story to give the little mermaid more agency in her salvation, a development echoed in the Disney adaptations, the tale nonetheless pivots on an image of a 'heavenly realm' that the mermaid may eventually come to know or comprehend as a beloved woman on dry land. Only after 300 years, however, may she hope to see that 'which lives forever' and 'rises through thin air, up to the shining stars. Just as [sea-people] rise through the water to see the lands on earth.' What students today can see in this example is a trope of the agonistic soul in a hard-won disembarkation at a harbour of knowledge, where the seeker after truth may come to know through persistence (300 years) all that is good and true concerning human life. This paper discusses several such examples from the Great Books and popular culture to suggest that teaching in the world of the 21st century could do worse than accede to some such perennial seeking.

Keywords: the Great Books, tradition, popular culture, 21st century directions in teaching

Procedia PDF Downloads 141
703 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory

Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi

Abstract:

Feature selection is one of the global combinatorial optimization problems in machine learning. It is concerned with removing irrelevant, noisy, and redundant data while preserving the meaning of the original data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs), which combine a genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy controlled great deluge algorithm, respectively, to strike a good balance between local search and genetic search. To verify the proposed approaches, numerical experiments were carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other meta-heuristic approaches.
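The memetic scheme described above, a genetic algorithm whose offspring are refined by a local search step, can be sketched as follows. This is a hypothetical toy illustration: the fitness function, attribute count, and a plain hill-climbing local search stand in for the rough-set dependency measure and the fuzzy record-to-record travel / great deluge searches used in the paper.

```python
import random

random.seed(1)

N_FEATURES = 10
RELEVANT = {0, 2, 5}   # toy ground truth: only these attributes matter

def fitness(chrom):
    # Stand-in for the rough-set dependency degree: reward covering
    # the relevant attributes, penalise the size of the subset.
    covered = sum(1 for i in RELEVANT if chrom[i])
    return covered - 0.1 * sum(chrom)

def crossover(a, b):
    p = random.randrange(1, N_FEATURES)
    return a[:p] + b[p:]

def mutate(chrom, rate=0.1):
    return [g ^ 1 if random.random() < rate else g for g in chrom]

def local_search(chrom):
    # Plain hill climbing: a stand-in for the fuzzy record-to-record /
    # great-deluge step that makes the GA "memetic".
    best = chrom[:]
    for i in range(N_FEATURES):
        cand = best[:]
        cand[i] ^= 1
        if fitness(cand) > fitness(best):
            best = cand
    return best

pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(20)]
for _ in range(30):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(10)]
    pop = elite + [local_search(c) for c in children]

best = max(pop, key=fitness)
selected = {i for i, g in enumerate(best) if g}   # converges to {0, 2, 5}
```

Because the toy fitness is separable, a single hill-climbing pass already reaches the optimal subset; the real fuzzy local searches matter precisely when the dependency measure is not separable.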

Keywords: rough set theory, attribute reduction, fuzzy logic, memetic algorithms, record to record algorithm, great deluge algorithm

Procedia PDF Downloads 434
702 Study of Behavior Tribological Cutting Tools Based on Coating

Authors: A. Achour, L. Chekour, A. Mekroud

Abstract:

Tribology, the science of lubrication, friction, and wear, plays an important role at the "crossroads" of sciences opened up by recent developments in industry. Its multidisciplinary nature reinforces its scientific interest. It covers all the sciences that deal with contact between two loaded solids in relative motion, and thus sits at the intersection of well-established disciplines such as solid and fluid mechanics, rheology, thermal science, materials science, and chemistry. Its experimental approach is based on physics and on the processing of signals and images. Optimizing the cutting conditions of a tool must contribute significantly to the productivity and development of advanced automated machining, because implementing such techniques requires sufficient knowledge of how the process evolves, in particular the evolution of tool wear. In addition, technological advances have promoted the use of very hard, refractory materials of poor machinability, which in turn require highly wear-resistant tool materials. In this study, we present the wear behavior of a machining tool during roughing operations as a function of the cutting parameters. The interpretation of the experimental results is based mainly on observations and analyses of the cutting edges of the tool using recent techniques: scanning electron microscopy (SEM) and laser-beam optical roughness measurement.

Keywords: friction, wear, tool, cutting

Procedia PDF Downloads 317
701 Reconsidering the Legitimacy of Capital Punishment in the Interpretation of the Human Right to Life in the Two Traditional Approaches

Authors: Yujie Zhang

Abstract:

There are debates around the legitimacy of capital punishment, i.e., whether death can serve as a proper punishment in our legal system. Different arguments have been raised, but none of them seems able to provide a determinate answer to the issue, which results in a lack of guidance for legal practice. This article therefore devotes itself to the effort to find such an answer. It takes the perspective of rights: by interpreting the concept of the right to life, with which capital punishment appears to conflict, under the two traditional approaches, it seeks to reveal the best available account of the right and its implications for capital punishment. This effort, however, is not a normative one focused on what ought to be. The article does not try to work out which argument we should choose and so settle the heated debate on whether capital punishment should be allowed. Nor does it propose which perspective we should take to approach the issue, or which account of rights must in general be better; rather, it is more of a thought experiment. It attempts to raise a new perspective on the issue of the legitimacy of capital punishment. Both its perspective and its conclusion are therefore tentative: what if we view this issue in a way we have never tried before, for example through the different accounts of the right to life? In this sense, the perspective could be contested and the conclusion rejected; other perspectives and conclusions are also possible. Notwithstanding this, the tentative perspective and account of the right offered here can still serve as a potential approach, since they can provide a determinate attitude toward capital punishment that is hard to achieve through the existing arguments.

Keywords: capital punishment, right to life, theories of rights, the choice theory

Procedia PDF Downloads 177
700 Biostabilisation of Sediments for the Protection of Marine Infrastructure from Scour

Authors: Rob Schindler

Abstract:

Industry-standard methods of mitigating erosion of seabed sediments rely on 'hard engineering' approaches which have numerous environmental shortcomings: (1) direct loss of habitat by smothering of benthic species, (2) disruption of sediment transport processes, damaging geomorphic and ecosystem functionality, (3) generation of secondary erosion problems, (4) introduction of material that may propagate non-local species, and (5) provision of pathways for the spread of invasive species. Recent studies have also revealed the importance of biological cohesion, the result of naturally occurring extra-cellular polymeric substances (EPS), in stabilizing natural sediments. Mimicking these strong bonding kinetics through the deliberate addition of EPS to sediments (henceforth termed 'biostabilisation') offers a means to mitigate erosion induced by structures or by episodic increases in hydrodynamic forcing (e.g. storms and floods) whilst avoiding, or reducing, hard engineering. Here we present unique experiments that systematically examine how biostabilisation reduces scour around a monopile in a current, a first step to realizing the potential of this new method of scour reduction for a wide range of engineering purposes in aquatic substrates. Experiments were performed in Plymouth University's recirculating sediment flume, which includes a recessed scour pit. The model monopile was 0.048 m in diameter, D. Assuming a prototype monopile diameter of 2.0 m yields a geometric ratio of 41.67. When applied to a 10 m prototype water depth this yields a model depth, d, of 0.24 m. The sediment pit containing the monopile was filled with different biostabilised substrata prepared using a mixture of fine sand (D50 = 230 μm) and EPS (Xanthan gum). Nine sand-EPS mixtures were examined, spanning EPS contents of 0.0% < b0 < 0.50%. Scour development was measured using a laser point gauge along a 530 mm centreline at 10 mm increments at regular periods over 5 h.
Maximum scour depth and excavated area were determined at different time steps and plotted against time to yield equilibrium values. After 5 hours the current was stopped and a detailed scan of the final scour morphology was taken. Results show that increasing EPS content causes a progressive reduction in the equilibrium depth and lateral extent of scour, and hence in excavated material. Very small amounts, equating to natural communities (< 0.1% by mass), reduce the rate, depth, and extent of scour around monopiles. Furthermore, the strong linear relationships between EPS content, equilibrium scour depth, excavation area, and the timescales of scouring offer a simple index on which to modify existing scour prediction methods. We conclude that the biostabilisation of sediments with EPS may offer a simple, cost-effective and ecologically sensitive means of reducing scour in a range of contexts including OWFs, bridge piers, pipeline installation, and void filling in rock armour. Biostabilisation may also reduce economic costs through (1) use of existing site sediments or waste dredged sediments, (2) reduced fabrication of materials, (3) lower transport costs, and (4) less dependence on specialist vessels and precise sub-sea assembly. Further, its potential environmental credentials may allow sensitive use of the seabed in marine protection zones across the globe.
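The geometric scaling used to size the model can be reproduced directly. The pile diameters and water depths below are taken from the abstract; the Froude velocity-scaling line is an added assumption about how a current speed would be scaled in such a flume study, not something stated in the abstract.

```python
import math

# Scaling for the monopile scour experiment (values from the abstract)
PROTOTYPE_PILE_D = 2.0     # m, prototype monopile diameter
MODEL_PILE_D = 0.048       # m, model monopile diameter D
scale = PROTOTYPE_PILE_D / MODEL_PILE_D      # geometric ratio, ~41.67

PROTOTYPE_DEPTH = 10.0     # m, prototype water depth
model_depth = PROTOTYPE_DEPTH / scale        # ~0.24 m, as reported

# Assumed Froude scaling: velocities scale with the square root of the
# geometric ratio (a common flume-modelling convention, assumed here).
def model_velocity(prototype_velocity):
    return prototype_velocity / math.sqrt(scale)
```

This reproduces the 41.67 ratio and 0.24 m model depth quoted in the abstract from the 0.048 m pile alone.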

Keywords: biostabilisation, EPS, marine, scour

Procedia PDF Downloads 151
699 Investigating The Nexus Between Energy Deficiency, Environmental Sustainability and Renewable Energy: The Role of Energy Trade in Global Perspectives

Authors: Fahim Ullah, Muhammad Usman

Abstract:

Energy consumption and environmental sustainability are hard challenges of the 21st century. Energy richness increases environmental pollution, while energy poverty hinders economic growth. Considering these two aspects, the present study calculates energy deficiency and examines the role of renewable energy in overcoming rising energy deficiency and carbon emissions for selected countries from 1990 to 2021. For the empirical analysis, this study uses method of moments panel quantile regression and, to check robustness, a panel quantile robust analysis. Graphical analysis indicates rising global energy deficiency over the last three decades, with energy consumption exceeding energy production. Empirical results show that renewable energy is a significant factor in reducing energy deficiency. Second, energy deficiency increases carbon emission levels, and renewable energy again decreases emission levels. This study recommends that global energy deficiency and rising carbon emissions can be controlled through structural change in the form of an energy transition that replaces non-renewable resources with renewable resources.

Keywords: energy deficiency, renewable energy, carbon emission, energy trade, PQL analysis

Procedia PDF Downloads 39
698 Relay Node Placement for Connectivity Restoration in Wireless Sensor Networks Using Genetic Algorithms

Authors: Hanieh Tarbiat Khosrowshahi, Mojtaba Shakeri

Abstract:

Wireless Sensor Networks (WSNs) consist of a set of sensor nodes with limited capabilities. WSNs may suffer multiple node failures when exposed to harsh environments such as military zones or disaster locations, losing connectivity by getting partitioned into disjoint segments. Relay nodes (RNs) are then introduced to restore connectivity. They cost more than sensors, as they benefit from mobility, more power, and a larger transmission range, which makes it important to use a minimum number of them. This paper addresses the problem of RN placement in a network with multiple disjoint segments by developing a genetic algorithm (GA). The problem is recast as the Steiner tree problem (which is known to be NP-hard), with the aim of finding the minimum number of Steiner points where RNs are to be placed to restore connectivity. An upper bound on the number of RNs is first computed to set the length of the initial chromosomes. The GA then iteratively reduces the number of RNs and determines their locations at the same time. Experimental results indicate that the proposed GA is capable of establishing network connectivity using a reasonable number of RNs compared to the best existing work.
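The upper-bound step mentioned above can be illustrated with a minimal sketch: connect the centroids of the disjoint segments with a minimum spanning tree and place relays along each edge at intervals of the transmission range. The coordinates and range below are hypothetical, and this computes only the bound used to size the chromosomes, not the GA itself.

```python
import math

# Centroids of three disjoint WSN segments (hypothetical coordinates, metres)
segments = [(0.0, 0.0), (100.0, 0.0), (50.0, 80.0)]
R = 30.0  # relay transmission range (assumed)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def mst_edges(points):
    # Prim's algorithm over the segment centroids.
    in_tree = {0}
    edges = []
    while len(in_tree) < len(points):
        best = min(((i, j) for i in in_tree
                    for j in range(len(points)) if j not in in_tree),
                   key=lambda e: dist(points[e[0]], points[e[1]]))
        edges.append(best)
        in_tree.add(best[1])
    return edges

# Upper bound: relays placed every R along each MST edge suffice to
# reconnect the segments, so the GA never needs more than this many.
upper_bound = sum(math.ceil(dist(segments[i], segments[j]) / R) - 1
                  for i, j in mst_edges(segments))
```

For these three centroids the bound is 6 relays; the GA would then search for placements that restore connectivity with fewer.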

Keywords: connectivity restoration, genetic algorithms, multiple-node failure, relay nodes, wireless sensor networks

Procedia PDF Downloads 222
697 Fund Seekers’ Deception in Peer-to-Peer Lending in Times of COVID

Authors: Olivier Mesly

Abstract:

This article examines the likelihood of deception on the part of borrowers wishing to obtain credit from institutional or private lenders. In our first study, we identify five explanatory variables that account for nearly forty percent of the propensity to act deceitfully: a poor credit history, debt, risky behavior, and, to a much lesser degree, irrational behavior and disconnection from the bundle of needs, goals, and preferences. For the second study, we remodeled the initial questionnaire to adapt it to the needs of institutional bankers and borrowers, especially those who engage in online peer-to-peer money lending, a growing business fueled by the COVID pandemic. We find that the three key psychological variables that help to indirectly predict the likelihood of deceitful behaviors and possible default on loan reimbursement, i.e., risky behaviors, irrationality, and disconnection, interact with each other to form a loop. This study presents two benefits: first, we provide evidence that it is to some degree possible to tighten control over lending practices. Second, we offer a pragmatic tool: a questionnaire that lenders can use or adapt to gauge potential borrowers' deceit, notably by combining its results with standard hard-data measures of risk.

Keywords: bundle of needs, default, debt, deception, risk, peer-to-peer lending

Procedia PDF Downloads 115
696 Auditor with the Javanese Characters: Revealing the Relationship towards Its Client

Authors: Krisna Damayanti

Abstract:

Negative issues about the relationship between auditors and clients are often heard. They arise in view of a variety of phenomena resulting from greedy audit practices that do not respect the independence of the audit profession or its professional code of ethics, a logical consequence of the practice of capitalism in accounting. This paper seeks to uncover existing auditing practices in Indonesia, especially in Java, which are associated with the strong influence of Javanese culture: reluctant/"shy", polite, "legowo", "ngemong", friendly, "not mentholo", "tepo seliro", "ngajeni", acquiescent. The method used is an interpretive approach, which emphasizes the role of language, interprets and understands, and sees social reality as something other than a label, name, or concept. Auditing practice in each country has a culture that affects the standards set by regulators, even where there has been an adaptation of IAS. In Indonesia, the regulators are dominated by people of Javanese origin, so Javanese culture is embedded in audit practice; conditions in Java thus require auditors to behave accordingly, and this sometimes interferes with the standard code of conduct that an auditor must execute. Auditors who live in Java have Javanese cultural characteristics that are hard to avoid in audit practice. Nevertheless, in practice, auditors remain relevant in their profession.

Keywords: auditors, Java, character, profession, code of ethics, client

Procedia PDF Downloads 421
695 Using the Weakest Precondition to Achieve Self-Stabilization in Critical Networks

Authors: Antonio Pizzarello, Oris Friesen

Abstract:

Networks such as the electric power grid must demonstrate exemplary performance and integrity. Integrity depends on the quality of both the system design model and the deployed software. Integrity of the deployed software is key, for both the original versions and the many that arise through numerous maintenance activities. Current software engineering technology and practice do not produce adequate integrity. Distributed systems utilize networks in which each node is an independent computer system. The connections between them are realized via a network that is normally redundantly connected, to guarantee the presence of a path between two nodes in case some branch fails. Furthermore, at each node there is software which may fail. Self-stabilizing protocols are usually present that recognize failure in the network and perform a repair action that brings the node back to a correct state. These protocols, first introduced by E. W. Dijkstra, are currently present in almost all Ethernets. Superstabilizing protocols, capable of reacting to a change in the network topology due to the removal or addition of a branch, are less common but are theoretically defined and available. This paper describes how to use the Software Integrity Assessment (SIA) methodology to analyze self-stabilizing software. SIA is based on the UNITY formalism for parallel and distributed programming, which allows the analysis of code for verifying the progress property "p leads-to q", which describes the progress of all computations starting in a state satisfying p to a state satisfying q via the execution of one or more system modules. As opposed to demonstrably inadequate test and evaluation methods, SIA allows the analysis and verification of any network self-stabilizing software, as well as any other software designed to recover from failure without external intervention by maintenance personnel.
The model to be analyzed is obtained by automatic translation of the system code into a transition system based on the use of the weakest precondition.
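As a toy illustration of the kind of progress property involved (not the SIA translation itself), the following sketch checks "p leads-to q" over a small hand-written transition system using a UNITY-style "ensures" fixpoint under weak fairness. The state names and actions are hypothetical.

```python
def leads_to(start, goal, actions):
    # UNITY-style progress check: grow the set of states known to
    # eventually reach `goal`, using the "ensures" rule under weak
    # fairness (every action is executed infinitely often).
    known = set(goal)
    changed = True
    while changed:
        changed = False
        for s in {src for src, _ in actions}:
            if s in known:
                continue
            succs = {d for src, d in actions if src == s}
            # s "ensures" the known set: every action keeps us in s or
            # moves into `known`, and at least one action enters `known`.
            if succs <= known | {s} and succs & known:
                known.add(s)
                changed = True
    return set(start) <= known

# Actions of a toy self-stabilising repair protocol (hypothetical):
actions = [("faulty", "repairing"),
           ("repairing", "repairing"),   # repair may take several steps
           ("repairing", "stable"),
           ("stable", "stable")]

# Under fairness, the self-loop on "repairing" cannot starve the
# transition to "stable", so "faulty" leads-to "stable" holds.
recovers = leads_to({"faulty"}, {"stable"}, actions)
```

A state with only a self-loop would correctly fail the check, since no fair computation from it ever reaches the goal.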

Keywords: network, power grid, self-stabilization, software integrity assessment, UNITY, weakest precondition

Procedia PDF Downloads 206
694 Types of Neurons in the Spinal Trigeminal Nucleus of the Camel Brain: Golgi Study

Authors: Qasim A. El Dwairi, Saleh M. Banihani, Ayat S. Banihani, Ziad M. Bataineh

Abstract:

Neurons in the spinal trigeminal nucleus of the camel were studied by Golgi impregnation. Neurons were classified based on differences in the size and shape of their cell bodies, the density of their dendritic trees, and the morphology and distribution of their appendages. In the spinal trigeminal nucleus of the camel, at least twelve types of neurons were identified, including stalked, islet, octopus-like, lobulated, boat-like, pyramidal, multipolar, round, oval, and elongated neurons. They have a large number of different forms of appendages, not only on their dendrites but also on their cell bodies. Neurons with unique large dilatations, especially at their dendritic branching points, were found. The morphological features of these neurons are described and compared with their counterparts in other species. The finding of a large number of neuronal types with different sizes and shapes, a large number of different forms of appendages on cell bodies and dendrites, and cells with unique features such as large dilated dendritic segments may indicate very complex processing of pain and temperature information at the level of the spinal trigeminal nucleus in the camel, an animal that traditionally lives in a very harsh environment (the desert).

Keywords: camel, Golgi, neurons, spinal trigeminal nucleus

Procedia PDF Downloads 320
693 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500

Authors: Mustafa Elfituri, Jonathan Cook

Abstract:

Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology for modeling many types of relations between independent objects. They are actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties, such as irregularity and poor locality, that make their performance different from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared-memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect its performance in order to identify possible changes that would improve it. Results are discussed in relation to the factors that contribute to performance degradation.
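The BFS kernel that Graph500 times can be sketched in a few lines. The graph below is a hypothetical toy stand-in for the benchmark's Kronecker graph, and the TEPS (traversed edges per second) figure of merit is computed the simple way, counting every edge inspection; the real benchmark uses a much larger scale-free graph and many sampled roots.

```python
import time
from collections import defaultdict, deque

# Tiny undirected graph standing in for a Graph500 Kronecker graph
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4), (4, 5)]
adj = defaultdict(list)
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

def bfs(root):
    # Level-by-level BFS: the kernel whose runtime Graph500 measures.
    parent = {root: root}
    frontier = deque([root])
    traversed = 0
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:
            traversed += 1          # every edge inspection is counted
            if v not in parent:
                parent[v] = u
                frontier.append(v)
    return parent, traversed

t0 = time.perf_counter()
parent, traversed = bfs(0)
elapsed = time.perf_counter() - t0
teps = traversed / elapsed if elapsed > 0 else float("inf")
```

The irregular, data-dependent accesses into `adj` are exactly what defeats caches and prefetchers on standard architectures, which is why the paper's factor analysis focuses on memory behavior rather than arithmetic.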

Keywords: graph computation, Graph500 benchmark, parallel architectures, parallel programming, workload characterization

Procedia PDF Downloads 127