sil

Forum Replies Created

Viewing 15 posts - 1 through 15 (of 504 total)
  • Author
    Posts
  • #53771
     sil 
    Participant

    @infosec703 wrote:

    I work on a very large privacy project and wanted to get folks' take on how you feel about the Fair Information Practice Principles and privacy when it comes to ethical hacking. I've been a part of systems requirements and systems development, and the FIPPs are the core of my work on both fronts. Has anyone else considered the FIPPs and privacy when they conduct their business?

    I am also hosting an international privacy symposium this summer – I am working to incorporate cyber security into the overall theme – if anyone has any suggestions, I would love to hear them.

    Thanks in advance.

    The issue with FIPPs, and other "frameworks" in general, is that they are usually very outdated. Think about that thoroughly: FIPPs is 40 years old (http://itlaw.wikia.com/wiki/Privacy_Act_of_1974), and technology was different back then. The threats differed, and the vulnerabilities differed.

    At the core of "ethical hacking," if I am tasked with discovering vulnerabilities, there is a high likelihood I am going to trample all over FIPPs-style frameworks:

    There must be a way for an individual to prevent information about him or her that was obtained for one purpose from being used or made available for other purposes without his or her consent. (purpose limitation)

    At the core of this FIPP statement, no one is giving me, as a tester, consent to access their information. The company storing the data is having me test it. How do you solve this paradox?

    Frameworks as a whole are started with good intent, but are often so broad they become self-defeating. For example, if you look at a PCI transaction, you have data in transit and data at rest. BOTH can be exploited to some degree (MITM the wire, decrypt the stored data). There is NO workaround for these facts. So what do professionals do? They apply band-aids: "implement stronger SSL, encrypt with uber ciphers." But they are not addressing the problem; they are merely delaying (slowing down) an attacker.

    Strong security needs to begin at the core protocols (OSI layers), while something is still in its SDLC or prototype-to-market phase. The reality, though, is that technology changes so fast this is not feasible on any scale. The "thinkers" need to re-think their game plans, because by the time you write up any framework, the next best thing comes along and the framework is useless. Let alone a 40-year-old framework.

    Just my .02

  • #53626
     sil 
    Participant

    I think I will get the… I don’t even know anymore. Certs bore me now 😉

  • #44582
     sil 
    Participant

    I've been swamped with stuff, but I began a six-week class via Stanford.

  • #47625
     sil 
    Participant

    @zeroone wrote:

    That line cracked me up ;D

    Zero switchport security (their entire innards were Cisco down)

  • #47623
     sil 
    Participant

    @alucian wrote:

    What struck me is that the audit-minded ones will not start a project unless they have at least 80-90% of the information and skills.

    I will share with you guys a gig I did about 3 weeks ago. I went to another state to perform an assessment/test against a videoconferencing system. The client is a financial trading information powerhouse whose revenue is in the billions. Premise for the test: "We get on conf calls with the SEC, we want to make sure our conference is secure, untappable, etc. We are using X system." Nothing else was given to me.

    I was NOT able to arbitrarily plug in anything without their IT staff getting a whiff of things and literally running to the location where a device was plugged in. I had zero knowledge of the infrastructure outside of: "this is the vendor we use, this is how we make these teleconference calls…"

    Under 5 minutes… Trusted laptop on the network, bootable operating system, no DHCP, sniffing the network. Seriously? … Nice MAC addresses flying by in tcpdump; think I will take one. No MIS guys running to find a rogue device. Teleconferencing? Game over. Credentials were horrible; gone in under 3-5 minutes. Could I have escalated? Sure, but I was only there to focus on the video/VoIP side of the equation, though I did mention it to them.

    Moral of the story: know your systems and protocols. Had I not understood how voice and video worked, I would likely have been intimidated and not known where to begin. Had I not understood how switching, routing, and VLANs work, I would not have been able to sniff, hijack a MAC, and get on the network. Had I not understood matters of timing, any password cracking would have been detected from excess packets flooding the network. Had I not had the ingenuity to create a quick targeted wordlist, I would not have gotten the password and credentials. I sat down, and in about 15 minutes at most I had access to do whatever an admin could do to their teleconferencing system. As an attacker, I could have re-routed the registrar to a rogue server, recorded the calls, taken pictures of anyone in a call, and so on. What's the big deal, you ask? Imagine a conference call before earnings are reported, where I was recording. One could make millions, take a company out of business, and so on and so forth.
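
    A quick targeted wordlist like the one mentioned above can be sketched in a few lines of Python. The seed terms, years, and mangling rules below are hypothetical placeholders, not the ones from the engagement:

```python
from itertools import product

def targeted_wordlist(seeds, years=("2023", "2024"), suffixes=("!", "123", "")):
    """Mangle company-specific seed words into likely password candidates."""
    candidates = set()
    for word in seeds:
        # Common case variants of each seed term
        for variant in (word.lower(), word.capitalize(), word.upper()):
            # Tack on an optional year and an optional suffix
            for year, suffix in product(("",) + tuple(years), suffixes):
                candidates.add(variant + year + suffix)
    return sorted(candidates)

# Seed with terms an attacker gleans on-site: company name, product, vendor.
words = targeted_wordlist(["acme", "trading"])
print(len(words))
```

    Feed the output to whatever cracker you favor; the win comes from seeding it with what you actually see on-site, not from the mangler itself.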

    At the end of the day, I made my report based on 2 days at the client. I was not allowed to perform a full-blown penetration test, as many departments had to be involved and the individual originally tasked with the test was out of the office, so the coordination never came to fruition. They were, however, spooked enough to understand I really needed to go no further. On a conf call with an entire security team, many of whom are visible in the industry (I know of them, the books they've written, what colleges they TEACH at, etc.), not one challenged me on anything I said. I was able to explain the technical risk and then shift into the management scope of risk management.

    Experience is everything. Not a cert, not a college. When you're comfortable standing your ground with any security engineer, then you're ready to do consulting on your own. When you don't necessarily need to do any research in a quick scenario like this, then you're at the top of the game. I am fortunate enough to be such a pain in the … that I have been able to collaborate with, talk with, and learn from some of the top people in the industry (and I mean the top). This comes from years upon years of studying and dabbling in the industry.

    As for money, it comes with the territory. It's not everything; there are times I am more curious and in a tinkering kind of mode for the sake of STILL learning something. In those cases I can lower a price if I see a benefit (learning something new, testing a unique environment, etc.). The last thing I ever do, though, is bite off more than I can chew. If I have trouble understanding a concept or technology, I take a step back rather than make an idiot of myself pretending to be able to do something I can't. I had to pass on a test that was out of my league involving satellites, yachts (really big mega-millionaire-type yachts), and a whole bunch of marine communications. I had to pass on an ATM (air traffic management) test because it's a whole different ballgame. Know your limits and be truthful with yourself. If you have to ask one too many questions and are shaky going into an environment, you might not be ready for this type of work yet.

  • #47618
     sil 
    Participant

    @ajohnson wrote:

    A penetration test is intended to provide reasonable assurance within the scope that it’s defined. The length of the engagement, testers’ knowledge/experience, and other limiting factors (i.e. removing critical systems from the scope, using a test/dev network instead of production, disallowing social engineering/client-side attacks, etc.) all factor into the equation. Your lawyer/legal team should explicitly state that this is a “best-effort” service within the contract, and you (and/or sales reps/PMs) should also clearly communicate the level of assurance such an engagement will provide to the client.

    On the flip side, the reality is that if someone exploited a glaring vulnerability after your test, chances are the tester sucked (for lack of better words). Sorry, I call it how I see it. In an instance where this occurs (you do a test and leave gaping holes), kiss any future business goodbye, and expect a huge black eye on your company as well. http://mailman.nanog.org/pipermail/nanog/2012-June/048786.html

  • #47546
     sil 
    Participant
  • #47555
     sil 
    Participant

    @samoletmaj wrote:

    Yes it is very funny…

    I have tried pretty much every kernel exploit out there written for my specific kernel, and it has worked for a ton of guys, just not me! I've been at it for about 6 days off the uname and nada…

    I'm moving on to another angle, and I am waiting for a response from IACRB… I will post up.

    Sometimes it's not about finding a kernel-level exploit that matches your EXACT kernel. Sometimes it's about finding one that causes an outcome and investigating from there. E.g., if you're on, say, 2.4.20 and you can find something like a 2.6 exploit, see what occurs when it runs. Does it crash an application? If so, can you figure out what it's doing via gdb?
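
    The "find one that causes an outcome" idea can be sketched as a rough version-proximity ranking over a local exploit collection. The index below (names and target versions) is purely illustrative, not a vetted exploit list:

```python
def candidate_exploits(kernel, exploit_index):
    """Rank local exploits by how closely their target version matches
    the kernel reported by `uname -r` (major, then minor, then patch)."""
    def parts(v):
        return tuple(int(x) for x in v.split(".")[:3])

    k = parts(kernel)
    scored = []
    for name, target in exploit_index.items():
        # Count how many leading version components agree
        score = sum(a == b for a, b in zip(k, parts(target)))
        scored.append((score, name))
    # Best match first; near misses are still worth throwing at the box
    return [name for score, name in sorted(scored, reverse=True)]

# Illustrative index of local-root exploits and the kernels they target
index = {"sock_sendpage": "2.6.0", "mremap": "2.4.20", "vmsplice": "2.6.17"}
print(candidate_exploits("2.4.20", index))
```

    Even the low-scoring candidates are worth a run: a crash or an odd error is exactly the "outcome" you then chase down in gdb.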

  • #47553
     sil 
    Participant

    @sephstorm wrote:

    aweSEC is correct, You certainly can elevate your privileges on that box. Honestly, I am a little skeptical that you cracked the root PW with john. The (root) PW’s for my exam were not in any wordlist I used, and bruteforce ran for nearly a week before I shut it off.

    Weird how we all have different experiences. I got the password in under an hour (maybe 30-ish minutes). The local "kernel" compromise/escalation (spoiler alert there) took me about 10 minutes off the uname.

    I try to tell / write / inform testers that one needs to literally get a storage device, create dirs (Windows, Linux, Solaris, etc.), then subdivide those folders into something like Local/Remote, further subdivided again:

    Exploits/Windows/Local/XP/Microsoft
    Exploits/Windows/Local/XP/3rdParty/RealPlayer

    Exploits/Unix/Linux/Local/Kernel/2.6.13
    Exploits/Unix/Linux/Local/Kernel/2.6.20
    Exploits/Unix/Linux/Local/Kernel/2.6General

    Then go from there. In real-world scenarios it cuts down so much time. If you REALLY wanna be spiffy about it, get yourself some cloud storage so you can ALWAYS download your tools at will.
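
    A minimal sketch of laying that skeleton down with Python's stdlib; the paths are taken verbatim from the layout above, and the root here is just a throwaway temp dir:

```python
import os
import tempfile

# Folder hierarchy following the OS / Local-vs-Remote layout described above
TREE = [
    "Exploits/Windows/Local/XP/Microsoft",
    "Exploits/Windows/Local/XP/3rdParty/RealPlayer",
    "Exploits/Unix/Linux/Local/Kernel/2.6.13",
    "Exploits/Unix/Linux/Local/Kernel/2.6.20",
    "Exploits/Unix/Linux/Local/Kernel/2.6General",
]

def build_skeleton(root):
    """Create the exploit folder hierarchy under `root`."""
    for path in TREE:
        os.makedirs(os.path.join(root, path), exist_ok=True)

root = tempfile.mkdtemp()
build_skeleton(root)
print(sorted(os.listdir(os.path.join(root, "Exploits"))))
```

    Point `root` at your storage device (or cloud-synced folder) instead of a temp dir and drop exploits into the leaves as you collect them.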

  • #47542
     sil 
    Participant

    They don't need to make their own malware and flood the market to sell the products. The approach itself is wrong. To understand this, you would need to go to http://maec.mitre.org and take in a lot of what's going on there. In a nutshell, this is the issue:

    Malware Signature
    1 + 1 = 2

    Attacker
    one + 1 = 2

    New Malware Signature
    one + 1 = 2

    Same attack + attacker
    one plus one equals 2

    New Malware Signature
    one plus one equals 2

    Same attack + attacker
    b25lIHBsdXMgb25l

    No matter how they want to attack it with heuristics, it's a guessing game based on what they KNOW. They can never see/know/understand an attacker, so there is a lot of assumption based on known knowns. Attackers will ALWAYS have the upper hand. The key isn't to rely on malware/AV companies; the key is to understand your network, applications, and patterns. E.g., any traffic baseline would yield anomalies in sites visited, bandwidth consumed, and so forth. You start seeing things leave your network destined for, say, China at 3am… it's something you should be quick to look at. The same applies for ANY connection LEAVING your network when, say, there is no one on a particular machine. HIPS also help here, but running, say, Tripwire or Samhain in an enterprise can be a headache.

  • #47540
     sil 
    Participant
  • #44201
     sil 
    Participant

    You know… In taking a look at mile2's followers, I can understand why we're seeing an influx of these posts. (http://www.linkedin.com/company/mile2/followers?page_num=2) Seems like mile2 is likely flooding Indian markets with their content/exam. I can respect that. They couldn't really make it over here in the States, where we pretty much laugh them off, so they market it elsewhere. Whatever sells. At the end of the day, the reality is that both mile2 and EC-Council are re-hashes of "not much." (Opinion; it is what it is, sorry.)

  • #44200
     sil 
    Participant

    I was gonna go for the CDFE myself then I realized, I don’t have that problem

    http://www.nefootankle.com/newsdesk/practice-news/comprehensive-diabetic-foot-exam-cdfe/

  • #47515
     sil 
    Participant

    @apollo wrote:

    What sort of things would you like to see with c/c++/asm ?  I’m pretty sure we can build a whole ‘nother book out of that.  The only things that I’ve been using c/c++ for lately are for network tools that require serious speed (ettercap/skipfish). 

    Skipfish is a brute if I ever saw one. As for the book, haven’t read it yet so I can’t comment but judging from the review, I may pick it up some time. Now as for what I would want to see in these types of books, differs from what many would want to see.

    Assembly as a whole is a huge monster to cover, let alone getting into the dissection, disassembly, and debugging of it. I would want a book that goes more in depth on the framework of exploitation. Not "look at the cool NOP sledding!" but something that covers the protections (ASLR/DEP/SEH/etc.), which again would likely be three books.

    Have you guys thought about doing something like an interactive eBook online? Think about that: a chapter-based book where, via subscription, one learns and performs samples via the web, perhaps with audio/video walkthroughs. It would be akin to a challenge in the sense that in order for someone to get to the next chapter, they'd have to finish and fully understand the chapter they're on. You could price it at the cost of a book for 6 months of access, while the reader not only goes through walkthroughs but can watch a video or hear an audio description of what you're talking about. Something like this has not been done, from all I have seen. It would be cool for ADHD/ADD-driven security slackers such as myself.

  • #47429
     sil 
    Participant

    It depends. For high availability, I love Stonegates; they have the ability to keep a VoIP call up and running even if one provider on an interface goes down. I favor Juniper over Cisco because overall they play better with equipment outside their own brand. I also like Palo Alto, but they can be pricey. At the end of the day, though, in a managed security services arena one gets used to them all, so I have no issue dealing with most. I do have my preferences when I am the designer.


Copyright ©2019 Caendra, Inc.
